Installation¶
The Confluent Platform is available in a variety of formats. For most of the platform, including services like Kafka, Confluent Control Center, Kafka Connect, and Schema Registry, you should use one of our easy-to-install packages:
- zip and tar archives – Recommended for OS X and the Quickstart
- deb packages via apt – Recommended for deploying services on Debian/Ubuntu
- rpm packages via yum – Recommended for deploying services on RHEL/CentOS/Fedora
Docker images for the Confluent Platform are currently available on Docker Hub. Alternatively, the source files for the images are available on GitHub if you would prefer to extend and/or rebuild the images and upload them to your own Docker Hub repository. If you are interested in deploying with Docker, please refer to our full Docker image documentation.
Confluent does not currently support Windows. Windows users can download and use the zip and tar archives, but will have to run the jar files directly rather than use the wrapper scripts in the bin/ directory.
Confluent Platform also includes client libraries for multiple languages that provide both low-level access to Kafka and higher-level stream processing. These libraries are available through the native packaging systems for each language; see the Clients section below for details.
Confluent Platform Installation¶
Requirements¶
The only requirement is Oracle Java >= 1.7. Java installation varies by platform, so please check the JRE installation requirements for your platform. Before installing the Confluent Platform, double-check your Java version:
$ java -version
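If you want to check the version from a provisioning script, note that java -version prints to stderr, not stdout. The following Python sketch shows one way to parse the reported version; the helper names are illustrative, not part of any Confluent tooling:

```python
import re
import subprocess

def java_major_minor(version_line):
    """Parse a line like 'java version "1.8.0_121"' into (1, 8)."""
    m = re.search(r'"(\d+)\.(\d+)', version_line)
    if m is None:
        raise ValueError("could not parse Java version from: " + version_line)
    return int(m.group(1)), int(m.group(2))

def java_is_recent_enough(min_version=(1, 7)):
    # `java -version` writes its output to stderr, not stdout.
    err = subprocess.run(["java", "-version"],
                         capture_output=True, text=True).stderr
    return java_major_minor(err.splitlines()[0]) >= min_version
```

For example, java_major_minor('java version "1.7.0_80"') returns (1, 7), which satisfies the minimum requirement.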
Various installation options are listed in the sections below. Select the approach you prefer and follow the given instructions to install the Confluent Platform.
Note that Confluent Platform uses the following ports by default. Make sure they are open so the platform components can communicate with each other, or modify the configuration of the relevant components to use an available port:
Component | Default Port |
---|---|
ZooKeeper | 2181 |
Apache Kafka brokers (plain text) | 9092 |
Schema Registry REST API | 8081 |
REST Proxy | 8082 |
Kafka Connect REST API | 8083 |
Confluent Control Center | 9021 |
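Before starting the services, you can verify that these ports are free (or, after startup, that the components are listening). This Python sketch simply attempts a TCP connection to each default port on localhost; the component list mirrors the table above:

```python
import socket

# Default ports from the table above.
DEFAULT_PORTS = {
    "ZooKeeper": 2181,
    "Kafka broker (plain text)": 9092,
    "Schema Registry REST API": 8081,
    "REST Proxy": 8082,
    "Kafka Connect REST API": 8083,
    "Confluent Control Center": 9021,
}

def port_in_use(port, host="localhost", timeout=1.0):
    """Return True if something is already listening on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for component, port in DEFAULT_PORTS.items():
    state = "in use" if port_in_use(port) else "free"
    print("%-28s %5d  %s" % (component, port, state))
```

A port reported as "in use" before you start any Confluent services indicates a conflict you should resolve, either by freeing the port or by reconfiguring the component.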
ZIP and TAR archives¶
The zip and tar archives contain the jars, driver scripts, and configuration files. They should be used on OS X and as a simple way to evaluate the platform.
Start by downloading one of the archives. You can choose between Confluent Enterprise, which includes all of Confluent’s components, and Confluent Open Source, which includes the open source parts of Confluent Platform. The complete list of downloads and their contents can be found at http://confluent.io/download/.
Next, extract the contents of the archive.
Archive Type | Command |
---|---|
Enterprise ZIP | unzip confluent-3.2.2-2.11.zip |
Open Source ZIP | unzip confluent-oss-3.2.2-2.11.zip |
Enterprise TAR | tar xzf confluent-3.2.2-2.11.tar.gz |
Open Source TAR | tar xzf confluent-oss-3.2.2-2.11.tar.gz |
Next, try the Quickstart to test out the Confluent Platform locally.
DEB packages via apt¶
The apt repositories provide packages for Debian-based Linux distributions such as Debian and Ubuntu.
First install Confluent’s public key, which is used to sign the packages in the apt repository.
$ wget -qO - http://packages.confluent.io/deb/3.2/archive.key | sudo apt-key add -
Add the repository to your /etc/apt/sources.list:
$ sudo add-apt-repository "deb [arch=amd64] http://packages.confluent.io/deb/3.2 stable main"
Run apt-get update and install Confluent Enterprise:
$ sudo apt-get update && sudo apt-get install confluent-platform-2.11
You can also choose to install Confluent Open Source:
$ sudo apt-get update && sudo apt-get install confluent-platform-oss-2.11
The number at the end of the package name specifies the Scala version. Currently supported versions are 2.11 (recommended) and 2.10. Individual components of the Confluent Platform are also available as standalone packages. See the Available Packages section for a listing of packages.
Next, try the Quickstart to test out the Confluent Platform locally.
RPM packages via yum¶
The yum repositories provide packages for RHEL, CentOS, and Fedora-based distributions.
First install Confluent’s public key, which is used to sign packages in the yum repository.
$ sudo rpm --import http://packages.confluent.io/rpm/3.2/archive.key
Add the repository to your /etc/yum.repos.d/ directory in a file named confluent.repo.
If you are using RHEL/CentOS/Oracle 6:
[Confluent.dist]
name=Confluent repository (dist)
baseurl=http://packages.confluent.io/rpm/3.2/6
gpgcheck=1
gpgkey=http://packages.confluent.io/rpm/3.2/archive.key
enabled=1
[Confluent]
name=Confluent repository
baseurl=http://packages.confluent.io/rpm/3.2
gpgcheck=1
gpgkey=http://packages.confluent.io/rpm/3.2/archive.key
enabled=1
If you are using RHEL/CentOS/Oracle 7:
[Confluent.dist]
name=Confluent repository (dist)
baseurl=http://packages.confluent.io/rpm/3.2/7
gpgcheck=1
gpgkey=http://packages.confluent.io/rpm/3.2/archive.key
enabled=1
[Confluent]
name=Confluent repository
baseurl=http://packages.confluent.io/rpm/3.2
gpgcheck=1
gpgkey=http://packages.confluent.io/rpm/3.2/archive.key
enabled=1
It is recommended to clear the yum caches before proceeding:
$ sudo yum clean all
The repository is now ready for use.
You can install Confluent Enterprise with:
$ sudo yum install confluent-platform-2.11
Or you can install Confluent Open Source with:
$ sudo yum install confluent-platform-oss-2.11
The number at the end of the package name specifies the Scala version. Currently supported versions are 2.11 (recommended) and 2.10. Individual components of the Confluent Platform are also available as standalone packages. See the Available Packages section for a listing of packages.
Next, try the Quickstart to test out the Confluent Platform locally.
Available Packages¶
Confluent Platform ships with two groups of packages:
- The platform packages of CP (confluent-platform-<scala_version> and confluent-platform-oss-<scala_version>), which are umbrella packages that install all components of Confluent Enterprise or Confluent Open Source respectively, i.e. all the individual packages listed in the next point.
- The individual packages of the Confluent Platform, such as confluent-kafka-<scala_version> and confluent-schema-registry.
Here, the platform packages (confluent-platform-<scala_version>) as well as the individual Kafka packages (confluent-kafka-<scala_version>) – and only these – are available in two variants, named by the respective Scala version of Apache Kafka that is included in the packages.
Note
Why Scala, and why different Scala versions? Apache Kafka is implemented in Scala and built for multiple versions of Scala. However, the Scala version only matters if you are actually using Scala for implementing your applications and you want a Kafka version built for the same Scala version you use. Otherwise any version should work, with 2.11 being recommended.
In CP 3.2.2, scala_version can be one of:
- 2.11 (recommended) – giving confluent-platform-2.11 and confluent-kafka-2.11
- 2.10 – giving confluent-platform-2.10 and confluent-kafka-2.10
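If you script your installations, package names can be derived mechanically from the Scala version. A small illustrative Python helper (the function and names are our own, not part of any Confluent tooling; only the platform and kafka packages carry a Scala-version suffix):

```python
# Supported Scala versions in CP 3.2.2; 2.11 is recommended.
SUPPORTED_SCALA_VERSIONS = ("2.11", "2.10")

# Packages whose names carry a Scala-version suffix.
SCALA_SUFFIXED = ("confluent-platform", "confluent-platform-oss", "confluent-kafka")

def package_name(component, scala_version="2.11"):
    """Build a package name such as 'confluent-platform-2.11'."""
    if scala_version not in SUPPORTED_SCALA_VERSIONS:
        raise ValueError("unsupported Scala version: " + scala_version)
    if component in SCALA_SUFFIXED:
        return "%s-%s" % (component, scala_version)
    # Components like confluent-schema-registry have no Scala suffix.
    return component
```

For example, package_name("confluent-platform") yields "confluent-platform-2.11", while package_name("confluent-schema-registry") returns the name unchanged.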
If you choose to install via an apt or yum repository, you can install individual components instead of the complete Confluent Platform. Use these packages when you only need one component on your server, e.g. on production servers. The following are packages corresponding to each of the Confluent Platform components:
Umbrella Packages:
- confluent-platform-<scala_version> – umbrella package that installs the complete Confluent Platform, i.e. all the individual packages mentioned below
- confluent-platform-oss-<scala_version> – umbrella package that installs only the open source Confluent Platform, i.e. all the individual packages mentioned below except confluent-control-center, confluent-replicator, confluent-rebalancer, and confluent-support-metrics
Enterprise Components:
- confluent-control-center – GUI-based system for managing and monitoring Apache Kafka
- confluent-kafka-connect-replicator – Multi-Datacenter Replication
- confluent-support-metrics – part of “Proactive Support”; improves Confluent’s support for the platform by collecting and reporting support metrics (“Metrics”) to Confluent
- confluent-rebalancer – automatically balances data and partitions between brokers in an Apache Kafka cluster
Open Source Components:
- confluent-kafka-<scala_version> – Apache Kafka, including Kafka Streams, the Kafka Connect framework, and the Kafka Java client
- confluent-kafka-connect-hdfs – Hadoop HDFS Sink Connector for Kafka Connect
- confluent-kafka-connect-jdbc – JDBC Source and Sink Connectors for Kafka Connect
- confluent-kafka-connect-elasticsearch – Elasticsearch Sink Connector for Kafka Connect
- confluent-kafka-connect-s3 – S3 Sink Connector for Kafka Connect
- confluent-schema-registry – Schema Registry
- confluent-kafka-rest – HTTP REST Proxy for Apache Kafka
- confluent-camus – Camus (deprecated)
- librdkafka – C/C++ Kafka client
  - DEB packages
    - librdkafka1 – library
    - librdkafka-dev – development headers and library
    - librdkafka1-dbg – debugging symbols
  - RPM packages
    - librdkafka1 – library
    - librdkafka-devel – development headers and library
    - librdkafka1-debuginfo – debugging symbols
    - librdkafka – source rpm
  - Source packages – included in zip and tar archives under the src/ directory
- avro – C/C++ Avro serialization library
  - DEB packages
    - libavro-c1 – C library and source
    - libavro-cpp1 – C++ library and source
    - libavro-c-dev – C development headers and library
    - libavro-cpp-dev – C++ development headers and library
  - RPM packages
    - avro-c – C library and source
    - avro-cpp – C++ library and source
    - avro-c-tools – command line tools
    - avro-c-devel – C development headers and library
    - avro-cpp-devel – C++ development headers and library
    - avro-c-debuginfo – C debugging symbols
    - avro-cpp-debuginfo – C++ debugging symbols
  - Source packages – included in zip and tar archives under the src/ directory
- confluent-libserdes – C/C++ Avro serialization with Schema Registry support
  - DEB packages
    - confluent-libserdes-c1 – library and source
    - confluent-libserdes-dev – development headers and library
  - RPM packages
    - confluent-libserdes – library and source
    - confluent-libserdes-devel – development headers and library
    - confluent-libserdes-debuginfo – debugging symbols
  - Source packages – included in zip and tar archives under the src/ directory
- confluent-kafka-python – Python client library
  - Python package – available on PyPI
  - Source packages – included in zip and tar archives under the src/ directory
- confluent-kafka-go – Go client library
  - Code repository – available on GitHub
  - Source packages – included in zip and tar archives under the src/ directory
- confluent-kafka-dotnet – .NET client library
  - Code repository – available on GitHub (github.com/confluentinc/confluent-kafka-dotnet)
  - Source packages – included in zip and tar archives under the src/ directory
The following open source packages may also be installed automatically as dependencies:
- confluent-common
- confluent-rest-utils
Migrating from Confluent Open Source to Confluent Enterprise¶
The path for migrating to the Enterprise version of Confluent Platform depends on how you originally installed the Open Source version.
DEB Packages via apt or RPM Packages via yum
If you installed Confluent Open Source with DEB packages via apt or RPM packages via yum, the upgrade is simple. Since the Enterprise packages are umbrella packages that contain everything you already have in Confluent Open Source plus the additional enterprise packages, you can install the additional packages in your existing deployment by running:
$ sudo yum install confluent-platform-2.11
or
$ sudo apt-get install confluent-platform-2.11
TAR or ZIP archives
If you installed Confluent Open Source from TAR or ZIP archives, you will download and install a new Confluent Enterprise archive that contains the entire platform (both the open source and the enterprise components), then start running Confluent from the new installation.
- Download the Confluent Enterprise TAR or ZIP archive from http://confluent.io/downloads/.
- Next, extract the contents of the archive into a new Confluent install directory. For zip files, use a GUI to extract the contents or run this command in a terminal:
$ unzip confluent-3.2.2-2.11.zip
For tar files, run this command:
$ tar xzf confluent-3.2.2-2.11.tar.gz
- Copy all configuration files from ./etc, including ./etc/kafka, ./etc/kafka-rest, ./etc/schema-registry, ./etc/confluent-control-center, and ./etc/camus, into the new directory you created in the previous step.
- Stop all Kafka services running in the Confluent Open Source directory. Depending on what was running, this will include kafka-rest, schema-registry, connect-distributed, kafka-server, and zookeeper-server.
- Start the corresponding services in the Confluent Enterprise directory, and in addition start any new Enterprise services you wish to use, for example confluent-control-center.
- Repeat these steps on all Confluent servers, one server at a time, to perform a rolling migration.
- If at any time you want to move back to Confluent Open Source, simply stop the services in the Enterprise installation directory and start them in the Open Source directory.
Clients¶
Maven repository for jars¶
All jars included in the packages are also available in the Confluent Maven repository. Here’s a sample POM file showing how to add this repository:
<repositories>
<repository>
<id>confluent</id>
<url>http://packages.confluent.io/maven/</url>
</repository>
<!-- further repository entries here -->
</repositories>
The Confluent Maven repository includes compiled versions of Kafka. To reference the Kafka version 0.10.2.1 that is included with CP 3.2.2, use the following in your pom.xml:
<dependencies>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.11</artifactId>
<!-- For CP 3.2.2 -->
<version>0.10.2.1-cp2</version>
</dependency>
<!-- further dependency entries here -->
</dependencies>
Note
Version names of Kafka in Apache vs. Kafka in Confluent Platform: Confluent always contributes patches back to the Apache Kafka open source project. However, the exact versions (and version names) included in Confluent Platform may differ from the Apache artifacts when Confluent Platform and Apache Kafka releases do not align. In the case they do differ, we keep the groupId and artifactId identical but append the suffix -cpX to the version identifier of the CP version (with X being a digit) in order to distinguish these from the Apache artifacts.
For example, to reference the Kafka version 0.9.0.1 that is included with Confluent Platform 2.0.1, you would use the following snippet in your pom.xml:
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.11</artifactId>
<!-- For CP 2.0.1 -->
<version>0.9.0.1-cp1</version>
</dependency>
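Going the other way, the upstream Apache Kafka version can be recovered from a CP artifact version by stripping the -cpX suffix. A small illustrative Python helper (our own sketch, not part of any Confluent tooling):

```python
import re

def split_cp_version(version):
    """Split a CP version such as '0.10.2.1-cp2' into ('0.10.2.1', 2).

    Plain Apache versions such as '0.10.2.1' return (version, None),
    since they carry no -cpX suffix.
    """
    m = re.fullmatch(r"(.+?)-cp(\d+)", version)
    if m is None:
        return version, None
    return m.group(1), int(m.group(2))
```

For example, split_cp_version("0.9.0.1-cp1") returns ("0.9.0.1", 1), matching the snippet above.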
Further artifacts are available. For example, to use Confluent’s open source serializers that integrate with the rest of the Confluent Platform, you would include the following in your pom.xml:
<dependencies>
<dependency>
<groupId>io.confluent</groupId>
<artifactId>kafka-avro-serializer</artifactId>
<!-- For CP 3.2.2 -->
<version>3.2.2</version>
</dependency>
<!-- further dependency entries here -->
</dependencies>
C/C++¶
The C/C++ client, called librdkafka, is available in source form and as precompiled binaries for Debian- and Red Hat-based Linux distributions. Most users will want to use the precompiled binaries.
For Linux distributions, follow the instructions for Debian or Red Hat distributions to set up the repositories, then use yum or apt-get to install the appropriate packages. For example, a developer building a C application on a Red Hat-based distribution would use the librdkafka-devel package:
$ sudo yum install librdkafka-devel
And on a Debian-based distribution they would use the librdkafka-dev package:
$ sudo apt-get install librdkafka-dev
On macOS, the latest release is available via Homebrew:
$ brew install librdkafka
The source code is also available in the zip and tar archives under the src/ directory.
Python¶
The Python client, called confluent-kafka-python, is available on PyPI. The Python client uses librdkafka, the C client, internally. To install the Python client, first install the C client including its development package, then install the library with pip (for both Linux and macOS):
$ sudo pip install confluent-kafka
Note that this will install the package globally for your Python environment. You may also use a virtualenv to install it only for your project.
Then in Python you can import and use the library:
from confluent_kafka import Producer

conf = {'bootstrap.servers': 'localhost:9092', 'client.id': 'test', 'default.topic.config': {'acks': 'all'}}
producer = Producer(conf)
topic = 'my-topic'  # any existing topic
producer.produce(topic, key='key', value='value')
producer.flush()  # block until outstanding messages are delivered
See the clients documentation for more examples.
The source code is also available in the zip and tar archives under the src/ directory.
Go¶
The Go client, called confluent-kafka-go, is distributed via GitHub and gopkg.in (http://labix.org/gopkg.in) to pin to specific versions. The Go client uses librdkafka, the C client, internally and exposes it as a Go library using cgo. To install the Go client, first install the C client including its development package, as well as a C build toolchain including pkg-config. On Red Hat-based Linux distributions, install the following packages in addition to librdkafka:
$ sudo yum groupinstall "Development Tools"
On Debian-based distributions, install the following in addition to librdkafka:
$ sudo apt-get install build-essential pkg-config git
On macOS using Homebrew, install the following:
$ brew install pkg-config git
Next, use go get to install the library:
$ go get gopkg.in/confluentinc/confluent-kafka-go.v0/kafka
Your Go code can now import and use the client. You can also build and run a small command line utility, go-kafkacat, to ensure the installation was successful:
$ go get gopkg.in/confluentinc/confluent-kafka-go.v0/examples/go-kafkacat
$ $GOPATH/bin/go-kafkacat --help
If you would like to statically link librdkafka, add the flag -tags static to the go get commands. This will statically link librdkafka itself, so its dynamic library will not be required on the target deployment system. Note, however, that librdkafka’s dependencies (such as ssl, sasl2, lz4, etc.) will still be linked dynamically and required on the target system. An experimental option for creating a completely statically linked binary is available as well: use the flag -tags static_all. This requires all dependencies to be available as static libraries (e.g., libsasl2.a). Static libraries are typically not installed by default but are available in the corresponding -dev or -devel packages (e.g., libsasl2-dev).
See the clients documentation for code examples showing how to use the library.
The source code is also available in the zip and tar archives under the src/ directory.
.NET¶
The .NET client, called confluent-kafka-dotnet, is available on NuGet. Internally, the .NET client uses librdkafka, the C client. Precompiled binaries for librdkafka are provided via the dependent librdkafka.redist NuGet package for a number of popular platforms (win-x64, win-x86, debian-x64, rhel-x64 and osx).
To reference confluent-kafka-dotnet from within a Visual Studio project, run the following command in the Package Manager Console:
PM> Install-Package Confluent.Kafka
Note
The dependent librdkafka.redist package will be installed automatically.
To reference confluent-kafka-dotnet in a .NET Core project.json file, include the following reference in the dependencies section:
"dependencies": {
...
"Confluent.Kafka": "0.9.4"
...
}
and then execute the dotnet restore command to restore project dependencies via NuGet.
confluent-kafka-dotnet targets the net451 and netstandard1.3 frameworks and is supported on .NET Framework 4.5.1 and higher and on .NET Core 1.0 and higher (on Windows, Linux, and Mac). We do not support confluent-kafka-dotnet on Mono.