Changelog
Version 3.4.0
Upgrade Notes
In 3.4.0, initial creation or validation of the topic used to store schemas has been reimplemented to use native Kafka protocol requests instead of accessing ZooKeeper directly. This means that you are no longer required to have direct access to the ZooKeeper cluster backing your Kafka cluster. However, it also means that the Schema Registry needs appropriate permissions to create topics (on its first execution) and to describe topics and configs (on subsequent executions, to validate that the topic is configured correctly). If you have authentication and authorization enabled on your Kafka cluster, ensure that your principal has the following permissions before upgrading the Schema Registry cluster (see the example after the tables below):
Create Schemas Topic
Operation | Resource | Reason
---|---|---
Describe | Topic: kafkastore.topic | Check existence of topic
Create | Cluster | Create the schemas topic, set compaction policy
Validate Schemas Topic
Operation | Resource | Reason
---|---|---
Describe | Topic: kafkastore.topic | Check existence of topic
DescribeConfigs | Topic: kafkastore.topic | Validate correct compaction policy on topic
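For illustration, the following is a minimal sketch of granting these permissions with the Kafka AdminClient (Kafka 2.0+ ACL classes); the bootstrap address, the principal User:schema-registry, and the topic name _schemas (the default for kafkastore.topic) are assumptions and should be replaced with your own values. The kafka-acls command-line tool can grant the same ACLs.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

import java.util.Arrays;
import java.util.Properties;

public class GrantSchemaRegistryAcls {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Assumed broker address; point this at your own cluster.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");

        // Assumed principal and schemas topic name (kafkastore.topic defaults to _schemas).
        String principal = "User:schema-registry";
        ResourcePattern schemasTopic =
            new ResourcePattern(ResourceType.TOPIC, "_schemas", PatternType.LITERAL);
        // "kafka-cluster" is the fixed name of the cluster resource in Kafka ACLs.
        ResourcePattern cluster =
            new ResourcePattern(ResourceType.CLUSTER, "kafka-cluster", PatternType.LITERAL);

        // Describe + DescribeConfigs on the schemas topic, Create on the cluster,
        // matching the permission tables above.
        AclBinding describeTopic = new AclBinding(schemasTopic,
            new AccessControlEntry(principal, "*", AclOperation.DESCRIBE, AclPermissionType.ALLOW));
        AclBinding describeConfigs = new AclBinding(schemasTopic,
            new AccessControlEntry(principal, "*", AclOperation.DESCRIBE_CONFIGS, AclPermissionType.ALLOW));
        AclBinding createOnCluster = new AclBinding(cluster,
            new AccessControlEntry(principal, "*", AclOperation.CREATE, AclPermissionType.ALLOW));

        try (AdminClient admin = AdminClient.create(props)) {
            admin.createAcls(Arrays.asList(describeTopic, describeConfigs, createOnCluster))
                 .all()
                 .get();
        }
    }
}
```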
Version 3.3.0
- Upgrade avro to 1.8.2
- PR-561 - Use TO_AVRO_LOGICAL_CONVERTERS to convert default values that are Kafka Connect logical data types to the internal format corresponding to the schema type. Logic copied from AvroData fromConnectData
- PR-549 - Replace usage of deprecated ZkUtils.DefaultAcls()
- Allow for some retries when validating that all nodes have the same master
- Relocate Avro serdes under a new avro package
- Increment Magic Byte for SchemaKey and add compatibility tests
- Add Delete Schema support
- Added avro-serde for Kafka Streams. Pulled from the example project.
- #506 - The AvroMessageFormatter passes byte[] to an Avro encoder, but Avro expects a ByteBuffer, so the bytes are now wrapped with ByteBuffer.wrap()
- Added optional kafkastore.group.id config to override the one automatically created by the schema registry
- PR-476 - Adapt to KAFKA-4636 changes: Per listener security settings overrides (KIP-103)
Version 3.2.1
- PR-503 - CLIENTS-244: Update 3.2.0 changelog
- PR-499 - Ensure the Schema Registry does not start with an uncompacted topic
- PR-497 - CLIENTS-103: Fix ArrayIndexOutOfBoundsException in SchemaRegistryPerformance by counting the registration attempt even if it failed.
- PR-493 - Fixes for CLIENTS-257
- PR-494 - MINOR: Add compact schemas topic doc
- PR-458 - CLIENTS-104: Add a few retries during startup to allow for slow metadata propagation after creating the _schemas topic.
Version 3.2.0
- PR-428 - Maven Checkstyle
- PR-425 - Logical Type support
- PR-440 - Documentation changes to reflect pull request 415
- PR-451 - Generalize schema incompatibility message.
- PR-457 - Update ClusterTestHarness to use o.a.k.common.utils.Time.
- PR-458 - CLIENTS-104: Add a few retries during startup to allow for slow metadata propagation after creating the _schemas topic.
- PR-464 - Improve request URL building in client
- PR-465 - Add topic to error string to make debugging easier.
- PR-448 - Fixes the following avro issues (complex union, document preservation, output schema != input schema)
- PR-468 - CC-443: AvroData’s caches should be synchronized for thread-safety
- PR-473 - Fix build to work post KIP-103.
- PR-474 - Fix broker endpoint extraction to correctly translate to the non-ListenerName version that clients use to initiate broker connections.
- PR-472 - Handle primitive types when specific.avro.reader is true
- PR-477 - Doc change for 3.2
- PR-488 - Don’t re-invoke SchemaBuilder.version() and SchemaBuilder.name() if the value has already been set.
- PR-494 - MINOR: Add compact schemas topic doc
Version 3.1.1
No changes
Version 3.1.0
- PR-429 - Moving licenses and notices to a new format, generated by an internal script.
- PR-415 - Option to apply fully transitive schema compatibility checking
- PR-412 - Require bash since we use some bashisms and fix a copyright.
- PR-396 - ZooKeeper and Kafka SASL support.
- PR-384 - Update the link to Google Java code Style
- PR-372 - Update KafkaStore to use moved TopicExistsException class.
- PR-373 - Added get all subjects.
- PR-364 - Increase testing timeouts from 5000ms to 15000ms
- PR-346 - configured log4j to write to log file
Version 3.0.1
Version 3.0.0
- PR-212 - change the documentation on port to have a high priority and list it higher up in the docs
- PR-298 - Bump version to 3.0.0-SNAPSHOT and Kafka dependency to 0.10.0.0-SNAPSHOT
- PR-300 - Using the new 0.9 Kafka consumer.
- PR-302 - Fix build to handle rack aware changes in Kafka.
- PR-305 - Update to match changed methods in CoreUtils
- PR-317 - Change the ‘host.name’ importance to high
- PR-319 - KafkaStore SSL support.
- PR-320 - API reference uses ‘integer’ Avro type which isn’t supported. ‘int’ is supported.
- PR-329 - https support.
- PR-264 - Add null checks to json serializer/deserializer
- PR-274 - Add support for Avro projections in decoders
- PR-275 - Fixed references to confluent common version
- PR-276 - Unit tests and bugfix for NPE when using nested optional fields
- PR-278 - Test cases for optional nested structs
- PR-280 - Fix fromConnectData for optional complex types
- PR-290 - Issue #284 Cannot set max.schemas.per.subject due to cast exception
- PR-297 - Allows any CharSequence implementation to be considered a string
- PR-318 - Minor cleanup
- PR-323 - Fix #142 - handle parsing non-json responses in the RestService
- PR-332 - Add status storage topic to Connect Avro sample config.
Version 2.0.0
- PR-141 - Incorrect path to log4j.properties file for simple zip file layout
- PR-143 - schema-registry-start does not work with -daemon argument
- PR-152 - Added point about Google code style to Readme.
- PR-163 - Expose more information when registry fails to start
- PR-165 - Add compatibility support for SchemaRegistryClient
- PR-167 - Update the versionCache with a new schemaVersionMap for a subject
- PR-169 - Use correct URL to update compatibility setting of a subject
- PR-180 - GH-177: Remove unneeded content-type headers for GET ops in quickstart and README
- PR-184 - Support multiple registry urls in client
- PR-186 - Rename LocalSchemaRegistryClient to make it clear it is only intended to be used as a mock in tests. Fixes #185.
- PR-187 - Correct example response for POST /subjects/(string: subject)/versions
- PR-188 - Address GH-168; enable unit testing of CachedSchemaRegistryClient
- PR-195 - Fixed typo in Exception message
- PR-196 - Require Java 7
- PR-197 - Enable test code sharing
- PR-198 - Update jersey, jackson and junit versions to match rest-utils
- PR-202 - Correct minor docs error in example response
- PR-203 - Issue 194 rename main
- PR-207 - Issue #170: PUT /config/(string: subject) should return 4xx for unknown subjects
- PR-210 - Issue #208: Several RestApiTest test cases don’t test proper exception behavior
- PR-219 - Update Kafka version to 0.8.3-SNAPSHOT so we can start developing using upcoming 0.8.3 features.
- PR-237 - Update uses of ZkUtils to match changes made in KAFKA-2639.
- PR-240 - Fixes Typo in the docs. ‘actsas’ -> ‘acts as’
- PR-242 - Fixed schema registry build against kafka trunk
- PR-243 - Use x.y.z versioning scheme (i.e. 2.0.0-SNAPSHOT)
- PR-245 - Fix mvn assembly setup
- PR-252 - Use Kafka compiled with Scala 2.11
- PR-257 - Updated classpath in schema-registry-run-class to reflect changes in pom.xml
- PR-258 - CC-53: Add worker configs for Avro Kafka Connect that integrates with schema registry.
- PR-146 - Added decoder to return a specific Avro record from bytes.
- PR-162 - Add implementation of new Deserializer interface.
- PR-192 - Add new module for Kafka JSON serialization stuff
- PR-193 - Add more JSON codec support
- PR-200 - Switch serializer config classes to use AbstractConfig from confluent-common instead of from Kafka.
- PR-222 - Add AvroConverter in new copycat-avro-converter jar to convert Copycat and Avro data.
- PR-234 - Add AvroConverter support for Decimal, Date, Time, and Timestamp logical types.
- PR-235 - Add caching of schema conversions in AvroData and AvroConverter.
- PR-247 - Update for Copycat -> Kafka Connect renaming.
- PR-251 - Add tests of conversion of null values from Kafka Connect and fix handling of null for int8 and int16 types.
- PR-254 - Ensure AvroConverter passes through null values without adding schemas and that deserialized null values are converted to SchemaAndValue.NULL.
- PR-255 - CC-44: Include version when deserializing Kafka Connect data.
- PR-256 - Encode null values for schemaless array entries and map keys and values as an Anything record.