Commit 8f89e90

docs: Fixed links

1 parent 6ecf871 commit 8f89e90

7 files changed, +16 -16 lines changed


docs/guide/guide.gradle

Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@ asciidoctor {
     jvmArgs("--add-opens", "java.base/sun.nio.ch=ALL-UNNAMED", "--add-opens", "java.base/java.io=ALL-UNNAMED")
   }
   attributes = [
-    'source-highlighter': 'prettify'
+    'source-highlighter': 'prettify'
   ]
 }
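For context, the `attributes` map in this Gradle build is handed straight to Asciidoctor, so each entry behaves exactly like an attribute declared in a document header. A minimal sketch of the header-side equivalent (the document title and source block are illustrative):

```asciidoc
= Sample Page
// Equivalent to the Gradle-side 'source-highlighter': 'prettify' entry:
// selects the highlighter Asciidoctor wires into [source] blocks.
:source-highlighter: prettify

[source,java]
----
System.out.println("highlighted by prettify");
----
```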

docs/guide/src/docs/asciidoc/_links.adoc

Lines changed: 1 addition & 0 deletions

@@ -1,3 +1,4 @@
+:link_releases: link:https://github.com/redis-field-engineering/redis-kafka-connect/releases[releases page]
 :link_redis_enterprise: link:https://redis.com/redis-enterprise-software/overview/[Redis Enterprise]
 :link_lettuce_uri: link:https://github.com/lettuce-io/lettuce-core/wiki/Redis-URI-and-connection-details#uri-syntax[Redis URI Syntax]
 :link_redis_notif: link:https://redis.io/docs/manual/keyspace-notifications[Redis Keyspace Notifications]
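Attributes declared in `_links.adoc` with the `:name: value` syntax can be referenced anywhere in the guide as `{name}`. A hypothetical usage of the newly added attribute (the surrounding sentence is illustrative):

```asciidoc
// {link_releases} expands to the link: macro defined in _links.adoc,
// rendering as a hyperlink labeled "releases page".
Download the connector archive from the {link_releases}.
```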

docs/guide/src/docs/asciidoc/docker.adoc

Lines changed: 2 additions & 2 deletions

@@ -12,11 +12,11 @@ https://docs.docker.com/get-docker/[Docker]

 == Run the example

-Clone the https://github.com/{github-owner}/{github-repo}.git[{github-repo}] repository and execute `run.sh` in `docker` directory:
+Clone the link:{project-scm}[github repository] and execute `run.sh` in `docker` directory:

 [source,console,subs="attributes"]
 ----
-git clone https://github.com/{github-owner}/{github-repo}.git
+git clone {project-scm}
 ./run.sh
 ----

docs/guide/src/docs/asciidoc/index.adoc

Lines changed: 2 additions & 1 deletion

@@ -6,11 +6,12 @@

 include::{includedir}/_links.adoc[]

-:leveloffset: 1
+:leveloffset: +1
 include::{includedir}/introduction.adoc[]
 include::{includedir}/install.adoc[]
 include::{includedir}/connect.adoc[]
 include::{includedir}/sink.adoc[]
 include::{includedir}/source.adoc[]
 include::{includedir}/docker.adoc[]
 include::{includedir}/resources.adoc[]
+:leveloffset: -1
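The switch from `:leveloffset: 1` to `:leveloffset: +1` is more than cosmetic: a signed value shifts section levels relative to the current offset, while an unsigned value sets the offset absolutely, and the closing `:leveloffset: -1` restores the previous level once the includes are done. A minimal sketch of the effect (file names are illustrative):

```asciidoc
= Main Document

:leveloffset: +1
// A level-0 title (= Chapter) inside chapter.adoc now renders
// as a level-1 section (== Chapter) of the main document.
include::chapter.adoc[]
:leveloffset: -1
// Back to the original offset for anything that follows.
```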

docs/guide/src/docs/asciidoc/install.adoc

Lines changed: 2 additions & 2 deletions

@@ -5,7 +5,7 @@ Select one of the methods below to install {project-title}.

 == Download

-Download the latest release archive from https://github.com/{github-owner}/{github-repo}/releases[here].
+Download the latest release archive from the link:{project-url}/releases[releases page].

 == Confluent Hub

@@ -14,4 +14,4 @@ Download the latest release archive from https://github.com/{github-owner}/{gith

 == Manually

-Follow the instructions in {link_manual_install}
+Follow the instructions in {link_manual_install}.

docs/guide/src/docs/asciidoc/sink.adoc

Lines changed: 6 additions & 7 deletions

@@ -1,8 +1,7 @@
 [[_sink]]
 = Sink Connector Guide
-:name: Redis Kafka Sink Connector

-The {name} consumes records from a Kafka topic and writes the data to Redis.
+The sink connector consumes records from a Kafka topic and writes the data to Redis.
 It includes the following features:

 * <<_sink_at_least_once_delivery,At least once delivery>>
@@ -12,17 +11,17 @@ It includes the following features:

 [[_sink_at_least_once_delivery]]
 == At least once delivery
-The {name} guarantees that records from the Kafka topic are delivered at least once.
+The sink connector guarantees that records from the Kafka topic are delivered at least once.


 [[_sink_tasks]]
 == Multiple tasks

-The {name} supports running one or more tasks.
+The sink connector supports running one or more tasks.
 You can specify the number of tasks with the `tasks.max` configuration property.

 [[_sink_data_structures]]
 == Redis Data Structures
-The {name} supports the following Redis data-structure types as targets:
+The sink connector supports the following Redis data-structure types as targets:

 [[_collection_key]]
 * Collections: <<_sink_stream,stream>>, <<_sink_list,list>>, <<_sink_set,set>>, <<_sink_zset,sorted set>>, <<_sink_timeseries,time series>>
@@ -168,10 +167,10 @@ The Kafka record value must be a number (e.g. `float64`) as it is used as the sa
 [[_sink_data_formats]]
 == Data Formats

-The {name} supports different data formats for record keys and values depending on the target Redis data structure.
+The sink connector supports different data formats for record keys and values depending on the target Redis data structure.

 === Kafka Record Keys
-The {name} expects Kafka record keys in a specific format depending on the configured target <<_sink_data_structures,Redis data structure>>:
+The sink connector expects Kafka record keys in a specific format depending on the configured target <<_sink_data_structures,Redis data structure>>:

 [options="header",cols="h,1,1"]
 |====

docs/guide/src/docs/asciidoc/source.adoc

Lines changed: 2 additions & 3 deletions

@@ -1,8 +1,7 @@
 [[_source]]
 = Source Connector Guide
-:name: Redis Kafka Source Connector

-The {name} includes 2 source connectors:
+{project-title} includes 2 source connectors:

 * <<_stream_source,Stream>>
 * <<_keys_source,Keys>>
@@ -21,7 +20,7 @@ It includes the following features:

 === Delivery Guarantees

-The {name} can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
+The stream source connector can be configured to ack stream messages either automatically (at-most-once delivery) or explicitly (at-least-once delivery).
 The default is at-least-once delivery.

 [[_stream_source_at_least_once_delivery]]
