Installation

The Confluent Platform is available in a variety of formats. For most of the platform, including services like Kafka, Confluent Control Center, Kafka Connect, and Schema Registry, you should use one of our easy-to-install packages, described in the sections below.

Docker images for the Confluent Platform are currently available on Docker Hub. Alternatively, the source files for the images are available on GitHub if you would prefer to extend or rebuild the images and upload them to your own Docker Hub repository. If you are interested in deploying with Docker, please refer to our full Docker image documentation.

Confluent does not currently support Windows. Windows users can download and use the zip and tar archives, but will have to run the jar files directly rather than use the wrapper scripts in the bin/ directory.

Confluent Platform also includes client libraries for multiple languages that provide both low-level access to Kafka and higher-level stream processing. These libraries are available through the native packaging system for each language; see the Clients section below.

Confluent Platform Installation

Requirements

The only requirement is Oracle Java >= 1.7. Java installation varies by platform, so please check the JRE installation instructions for your platform. Before installing the Confluent Platform, double-check your Java version:

$ java -version

Various installation options are listed in the sections below. Select the approach you prefer and follow the given instructions to install the Confluent Platform.

ZIP and TAR archives

These archives contain the jars, driver scripts, and configuration files in plain zip and tar archives. Use them on OS X, or as a simple way to evaluate the platform.

Start by downloading one of the archives. You can choose between Confluent Enterprise, which includes all of Confluent’s components, and Confluent Open Source, which includes only the open source parts of the Confluent Platform. The complete list of downloads and their contents can be found at http://confluent.io/download/.

Next, extract the contents of the archive.

Archive Type      Command
Enterprise ZIP    unzip confluent-3.1.2-2.11.zip
Open Source ZIP   unzip confluent-oss-3.1.2-2.11.zip
Enterprise TAR    tar xzf confluent-3.1.2-2.11.tar.gz
Open Source TAR   tar xzf confluent-oss-3.1.2-2.11.tar.gz

Next, try the Quickstart to test out the Confluent Platform locally.

DEB packages via apt

The apt repositories provide packages for Debian-based Linux distributions such as Debian and Ubuntu.

First install Confluent’s public key, which is used to sign the packages in the apt repository.

$ wget -qO - http://packages.confluent.io/deb/3.1/archive.key | sudo apt-key add -

Add the repository to your /etc/apt/sources.list:

$ sudo add-apt-repository "deb [arch=amd64] http://packages.confluent.io/deb/3.1 stable main"

Run apt-get update and install Confluent Enterprise:

$ sudo apt-get update && sudo apt-get install confluent-platform-2.11

You can also choose to install Confluent Open Source:

$ sudo apt-get update && sudo apt-get install confluent-platform-oss-2.11

The number at the end of the package name specifies the Scala version. Currently supported versions are 2.11 (recommended) and 2.10. Individual components of the Confluent Platform are also available as standalone packages. See the Available Packages section for a listing of packages.

Next, try the Quickstart to test out the Confluent Platform locally.

RPM packages via yum

The yum repositories provide packages for RHEL, CentOS, and Fedora-based distributions.

First install Confluent’s public key, which is used to sign packages in the yum repository.

$ sudo rpm --import http://packages.confluent.io/rpm/3.1/archive.key

Add the repository to your /etc/yum.repos.d/ directory in a file named confluent.repo.

If you are using RHEL/CentOS/Oracle 6:

[Confluent.dist]
name=Confluent repository (dist)
baseurl=http://packages.confluent.io/rpm/3.1/6
gpgcheck=1
gpgkey=http://packages.confluent.io/rpm/3.1/archive.key
enabled=1

[Confluent]
name=Confluent repository
baseurl=http://packages.confluent.io/rpm/3.1
gpgcheck=1
gpgkey=http://packages.confluent.io/rpm/3.1/archive.key
enabled=1

If you are using RHEL/CentOS/Oracle 7:

[Confluent.dist]
name=Confluent repository (dist)
baseurl=http://packages.confluent.io/rpm/3.1/7
gpgcheck=1
gpgkey=http://packages.confluent.io/rpm/3.1/archive.key
enabled=1

[Confluent]
name=Confluent repository
baseurl=http://packages.confluent.io/rpm/3.1
gpgcheck=1
gpgkey=http://packages.confluent.io/rpm/3.1/archive.key
enabled=1

It is recommended to clear the yum caches before proceeding:

$ sudo yum clean all

The repository is now ready for use.

You can install Confluent Enterprise with:

$ sudo yum install confluent-platform-2.11

Or you can install Confluent Open Source with:

$ sudo yum install confluent-platform-oss-2.11

The number at the end of the package name specifies the Scala version. Currently supported versions are 2.11 (recommended) and 2.10. Individual components of the Confluent Platform are also available as standalone packages. See the Available Packages section for a listing of packages.

Next, try the Quickstart to test out the Confluent Platform locally.

Available Packages

Confluent Platform ships with two groups of packages:

  1. The platform packages (confluent-platform-<scala_version> and confluent-platform-oss-<scala_version>), which are umbrella packages that install all components of Confluent Enterprise or Confluent Open Source, respectively, i.e. all the individual packages. See next point.
  2. The individual packages of the Confluent Platform, such as confluent-kafka-<scala_version> and confluent-schema-registry.

Note that only the platform packages (confluent-platform-<scala_version>) and the individual Kafka packages (confluent-kafka-<scala_version>) are available in two variants, named after the Scala version of the Apache Kafka build they include.

Note

Why Scala, and why different Scala versions? Apache Kafka is implemented in Scala and is built for multiple Scala versions. However, the Scala version only matters if you implement your applications in Scala and want a Kafka build that matches the Scala version you use. Otherwise, any version should work, with 2.11 recommended.

In CP 3.1.2, scala_version can be one of:

  • 2.11 (recommended) – giving confluent-platform-2.11 and confluent-kafka-2.11
  • 2.10 – giving confluent-platform-2.10 and confluent-kafka-2.10
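
The naming scheme above can be sketched as a small helper function (hypothetical, for illustration only; it is not part of the platform):

```python
def platform_package(scala_version, oss=False):
    # Build the umbrella package name for a given Scala version,
    # following the <component>-<scala_version> scheme described above.
    base = "confluent-platform-oss" if oss else "confluent-platform"
    return "{0}-{1}".format(base, scala_version)

print(platform_package("2.11"))            # confluent-platform-2.11
print(platform_package("2.10", oss=True))  # confluent-platform-oss-2.10
```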

If you choose to install via an apt or yum repository, you can install individual components instead of the complete Confluent Platform. Use these packages when you need only one component on a server, e.g. on production servers. The following packages correspond to each of the Confluent Platform components:

  • Umbrella Packages:

    • confluent-platform-<scala_version> – umbrella package that installs the complete Confluent Platform, i.e. all the individual packages mentioned below
    • confluent-platform-oss-<scala_version> – umbrella package that installs only the open source parts of the Confluent Platform, i.e. all the individual packages mentioned below except confluent-control-center, confluent-kafka-connect-replicator, confluent-rebalancer, and confluent-support-metrics.
  • Enterprise Components:

    • confluent-control-center – GUI-based system for managing and monitoring Apache Kafka
    • confluent-kafka-connect-replicator – Multi-Datacenter Replication
    • confluent-support-metrics – Part of “Proactive Support” - improves Confluent’s support for the platform by collecting and reporting support metrics (“Metrics”) to Confluent
    • confluent-rebalancer – Automatically balances data and partitions between brokers in an Apache Kafka cluster
  • Open Source Components:

    • confluent-kafka-<scala_version> – Apache Kafka, including Kafka Streams, Kafka Connect framework, and Kafka Java client

    • confluent-kafka-connect-hdfs – Hadoop HDFS Sink Connector for Kafka Connect

    • confluent-kafka-connect-jdbc – JDBC Source and Sink Connectors for Kafka Connect

    • confluent-kafka-connect-elasticsearch – Elasticsearch Connector for Kafka Connect (Sink)

    • confluent-schema-registry – Schema Registry

    • confluent-kafka-rest – HTTP REST Proxy for Apache Kafka

    • confluent-camus – Camus (deprecated)

    • librdkafka – C/C++ Kafka client

      • DEB packages

        • librdkafka1 – library
        • librdkafka-dev – development headers and library
        • librdkafka1-dbg – debugging symbols
      • RPM packages

        • librdkafka1 – library
        • librdkafka-devel – development headers and library
        • librdkafka1-debuginfo – debugging symbols
        • librdkafka – source rpm
      • Source packages - included in zip and tar archives under src/ directory

    • avro – C/C++ Avro serialization library

      • DEB packages

        • libavro-c1 – C library and source
        • libavro-cpp1 – C++ library and source
        • libavro-c-dev – C development headers and library
        • libavro-cpp-dev – C++ development headers and library
      • RPM packages

        • avro-c – C library and source
        • avro-cpp – C++ library and source
        • avro-c-tools – command line tools
        • avro-c-devel – C development headers and library
        • avro-cpp-devel – C++ development headers and library
        • avro-c-debuginfo – C debugging symbols
        • avro-cpp-debuginfo – C++ debugging symbols
      • Source packages - included in zip and tar archives under src/ directory

    • confluent-libserdes – C/C++ Avro Serialization with Schema Registry support

      • DEB packages

        • confluent-libserdes-c1 – library and source
        • confluent-libserdes-dev – development headers and library
      • RPM packages

        • confluent-libserdes – library and source
        • confluent-libserdes-devel – development headers and library
        • confluent-libserdes-debuginfo – debugging symbols
      • Source packages - included in zip and tar archives under src/ directory

    • confluent-kafka-python – Python client library

      • Python package – available on PyPI
      • Source packages – included in zip and tar archives under src/ directory
    • confluent-kafka-go – Go client library

      • Code repository – available on GitHub
      • Source packages – included in zip and tar archives under src/ directory

The following open source packages may also be installed automatically as dependencies:

  • confluent-common
  • confluent-rest-utils

Migrating from Confluent Open Source to Confluent Enterprise

The path for migrating to the Enterprise version of Confluent Platform depends on how you originally installed the Open Source version.

DEB Packages via apt or RPM Packages via yum

If you installed Confluent Open Source with DEB packages via apt or RPM packages via yum, the upgrade is very simple. Since the Enterprise packages are simply umbrella packages that contain everything you already have in Confluent Open Source plus additional enterprise packages, you can install the additional packages in your existing deployment by running:

$ sudo yum install confluent-platform-2.11

or

$ sudo apt-get install confluent-platform-2.11

TAR or ZIP archives

If you installed Confluent Open Source from TAR or ZIP archives, you will download and install a new Confluent Enterprise archive that contains the entire platform - both the open source and the enterprise components - and start running Confluent from the new installation.

  • Download the Confluent Enterprise TAR or ZIP archive from http://confluent.io/downloads/.
  • Next, extract the contents of the archive into a new Confluent install directory. For zip files, use a GUI to extract the contents or run this command in a terminal:

$ unzip confluent-3.1.2-2.11.zip

For tar files run this command:

$ tar xzf confluent-3.1.2-2.11.tar.gz

  • Copy all configuration files from ./etc, including ./etc/kafka, ./etc/kafka-rest, ./etc/schema-registry, ./etc/confluent-control-center, and ./etc/camus, into the new directory you created in the previous step.
  • Stop all Kafka services running in the Confluent Open Source directory. Depending on what was running, this may include kafka-rest, schema-registry, connect-distributed, kafka-server, and zookeeper-server.
  • Start the corresponding services in the Confluent Enterprise directory, and in addition start any new Enterprise services you wish to use - for example confluent-control-center.
  • Repeat these steps on all Confluent servers, one server at a time, to perform a rolling migration.
  • If at any time you want to move back to Confluent Open Source, simply stop the services in the Enterprise installation directory and start them in the Open Source directory.

Clients

Maven repository for jars

All jars included in the packages are also available in the Confluent Maven repository. Here’s a sample POM file showing how to add this repository:

<repositories>

  <repository>
    <id>confluent</id>
    <url>http://packages.confluent.io/maven/</url>
  </repository>

  <!-- further repository entries here -->

</repositories>

The Confluent Maven repository includes compiled versions of Kafka. To reference the Kafka version 0.10.1.1 that is included with CP 3.1.2, use the following in your pom.xml:

<dependencies>

  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <!-- For CP 3.1.2 -->
    <version>0.10.1.1-cp1</version>
  </dependency>

  <!-- further dependency entries here -->

</dependencies>

Note

Version names of Kafka in Apache vs. Kafka in Confluent Platform: Confluent always contributes patches back to the Apache Kafka open source project. However, the exact versions (and version names) included in Confluent Platform may differ from the Apache artifacts when Confluent Platform and Apache Kafka releases do not align. When they differ, we keep the groupId and artifactId identical but append the suffix -cpX (where X is a digit) to the version identifier of the CP version in order to distinguish these artifacts from the Apache ones.

For example, to reference the Kafka version 0.9.0.1 that is included with Confluent Platform 2.0.1, you would use the following snippet in your pom.xml:

<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_2.11</artifactId>
  <!-- For CP 2.0.1 -->
  <version>0.9.0.1-cp1</version>
</dependency>
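
The -cpX suffix convention can be expressed as a tiny helper (hypothetical, for illustration only):

```python
def cp_artifact_version(apache_version, cp_revision=1):
    # Return the Maven version string Confluent publishes for its build
    # of a given Apache Kafka version, per the -cpX convention above.
    return "{0}-cp{1}".format(apache_version, cp_revision)

# Kafka 0.10.1.1 as shipped in CP 3.1.2:
print(cp_artifact_version("0.10.1.1"))  # 0.10.1.1-cp1
```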

Further artifacts are available. For example, to use Confluent’s open source serializers that integrate with the rest of the Confluent Platform you would include the following in your pom.xml:

<dependencies>

  <dependency>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-avro-serializer</artifactId>
    <!-- For CP 3.1.2 -->
    <version>3.1.2</version>
  </dependency>

  <!-- further dependency entries here -->

</dependencies>

C/C++

The C/C++ client, called librdkafka, is available in source form and as precompiled binaries for Debian-based and Red Hat-based Linux distributions. Most users will want to use the precompiled binaries.

For Linux distributions, follow the instructions above for Debian or Red Hat distributions to set up the repositories, then use yum or apt-get to install the appropriate packages. For example, a developer building a C application on a Red Hat-based distribution would use the librdkafka-devel package:

$ sudo yum install librdkafka-devel

And on a Debian-based distribution they would use the librdkafka-dev package:

$ sudo apt-get install librdkafka-dev

On macOS, the latest release is available via Homebrew:

$ brew install librdkafka

The source code is also available in the zip and tar archives under the directory src/.

Python

The Python client, called confluent-kafka-python, is available on PyPI. It uses librdkafka, the C client, internally. To install the Python client, first install the C client, including its development package, then install the library with pip (on both Linux and macOS):

$ sudo pip install confluent-kafka

Note that this will install the package globally for your Python environment. You may also use a virtualenv to install it only for your project.

Then in Python you can import and use the library:

from confluent_kafka import Producer

conf = {'bootstrap.servers': 'localhost:9092',
        'client.id': 'test',
        'default.topic.config': {'acks': 'all'}}
producer = Producer(conf)
producer.produce('my-topic', key='key', value='value')  # replace 'my-topic' with your topic
producer.flush()  # block until all outstanding messages are delivered

See the clients documentation for more examples.

The source code is also available in the zip and tar archives under the directory src/.

Go

The Go client, called confluent-kafka-go, is distributed via GitHub and gopkg.in, which is used to pin to specific versions. The Go client uses librdkafka, the C client, internally and exposes it as a Go library using cgo. To install the Go client, first install the C client, including its development package, as well as a C build toolchain including pkg-config. On Red Hat-based Linux distributions, install the following in addition to librdkafka:

$ sudo yum groupinstall "Development Tools"

On Debian-based distributions, install the following in addition to librdkafka:

$ sudo apt-get install build-essential pkg-config git

On macOS using Homebrew, install the following:

$ brew install pkg-config git

Next, use go get to install the library:

$ go get gopkg.in/confluentinc/confluent-kafka-go.v0/kafka

Your Go code can now import and use the client. You can also build and run a small command line utility, go-kafkacat, to ensure the installation was successful:

$ go get gopkg.in/confluentinc/confluent-kafka-go.v0/examples/go-kafkacat
$ $GOPATH/bin/go-kafkacat --help

If you would like to statically link librdkafka, add the flag -tags static to the go get commands. This statically links librdkafka itself, so its dynamic library will not be required on the target deployment system. Note, however, that librdkafka’s dependencies (such as ssl, sasl2, and lz4) will still be linked dynamically and must be present on the target system. An experimental option for creating a completely statically linked binary is also available: use the flag -tags static_all. This requires all dependencies to be available as static libraries (e.g., libsasl2.a). Static libraries are typically not installed by default but are available in the corresponding -dev or -devel packages (e.g., libsasl2-dev).

See the clients documentation for code examples showing how to use the library.

The source code is also available in the zip and tar archives under the directory src/.