imflog / schema-registry-plugin
Gradle plugin to interact with Confluent Schema-Registry.
License: Apache License 2.0
Right now the version bump and release are manual; we need to change that.
We could use a GitHub Action to do so?
It would also be great to add the generated packages to the GitHub repository (like this)
As in #71 but for JSON.
Hi - first, thanks for the great project, looking forward to using it successfully!
Second - I'm a moron, and especially a Gradle moron.
For SOME reason, when running a Gradle task, Gradle REALLY wants to find the io.confluent:kafka-schema-registry-parent:7.3.1 POM at plugins.gradle.org and not from the Confluent Maven repository.
First the error:
FAILURE: Build failed with an exception.
* What went wrong:
A problem occurred configuring project ':app'.
> Could not resolve all files for configuration ':app:classpath'.
> Could not resolve com.github.imflog:kafka-schema-registry-gradle-plugin:1.9.1.
Required by:
project :app > com.github.imflog.kafka-schema-registry-gradle-plugin:com.github.imflog.kafka-schema-registry-gradle-plugin.gradle.plugin:1.9.1
> Could not resolve com.github.imflog:kafka-schema-registry-gradle-plugin:1.9.1.
> Could not parse POM https://plugins.gradle.org/m2/com/github/imflog/kafka-schema-registry-gradle-plugin/1.9.1/kafka-schema-registry-gradle-plugin-1.9.1.pom
> Could not find io.confluent:kafka-schema-registry-parent:7.3.1.
Searched in the following locations:
- https://plugins.gradle.org/m2/io/confluent/kafka-schema-registry-parent/7.3.1/kafka-schema-registry-parent-7.3.1.pom
If the artifact you are trying to retrieve can be found in the repository but without metadata in 'Maven POM' format, you need to adjust the 'metadataSources { ... }' of the repository declaration.
Here's my build.gradle:
/*
* This file was generated by the Gradle 'init' task.
*
* This generated file contains a sample Java application project to get you started.
* For more details take a look at the 'Building Java & JVM projects' chapter in the Gradle
* User Manual available at https://docs.gradle.org/8.0/userguide/building_java_projects.html
*/
plugins {
// Apply the java plugin to add support for Java
id 'java'
// Apply the application plugin to add support for building a CLI application in Java.
id 'application'
// Used for code formatting
// https://plugins.gradle.org/plugin/com.github.sherter.google-java-format
id 'com.github.sherter.google-java-format' version '0.9'
// parse arvo into java
id "com.github.davidmc24.gradle.plugin.avro" version "1.6.0"
// download schema from schema registry
id "com.github.imflog.kafka-schema-registry-gradle-plugin" version "1.9.1"
}
repositories {
mavenCentral()
maven {
url "https://packages.confluent.io/maven"
}
}
dependencies {
// Use JUnit Jupiter for testing.
testImplementation 'org.junit.jupiter:junit-jupiter:5.9.1'
testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine:5.9.1'
// This dependency is used by the application.
implementation 'com.google.guava:guava:31.1-jre'
implementation 'org.apache.kafka:kafka-streams:3.4.0'
implementation group: 'org.slf4j', name: 'slf4j-log4j12', version: '2.0.6'
implementation group: 'org.slf4j', name: 'slf4j-api', version: '2.0.6'
implementation 'org.apache.logging.log4j:log4j-slf4j-impl:2.19.0'
implementation 'org.apache.avro:avro:1.11.1'
implementation ('io.confluent:kafka-streams-avro-serde:7.3.1') {
exclude group: 'org.apache.kafka', module: 'kafka-clients'
}
}
schemaRegistry {
url = 'REG URL'
credentials {
username = '$SCHEMA_REGISTRY_USERNAME'
password = '$SCHEMA_REGISTRY_PASSWORD'
}
quiet = true
download {
// extension of the output file depends on the schema type
subject('event_ingestion-value', 'app/src/main/avro', 'Event.avsc')
}
}
application {
// Define the main class for the application.
mainClass = 'com.example.App'
}
task runStream(type: JavaExec) {
mainClass = 'com.example.App'
classpath sourceSets.main.runtimeClasspath
}
tasks.named('test') {
// Use JUnit Platform for unit tests.
useJUnitPlatform()
}
// code formatting
tasks.withType(JavaCompile).configureEach { javaCompile ->
javaCompile.dependsOn 'googleJavaFormat'
options.compilerArgs << "-Xlint:deprecation"
options.encoding = 'UTF-8'
}
Thoughts??? THANKS! Mark
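For context, the usual fix for this kind of failure is that Gradle resolves plugins (and their parent POMs) only from the repositories declared for plugin resolution, not from the project's `repositories {}` block. A sketch of the fix, assuming a standard `settings.gradle` in the project root:

```groovy
// settings.gradle -- repositories used for *plugin* resolution are configured
// here, separately from the project's own repositories {} block
pluginManagement {
    repositories {
        gradlePluginPortal()
        // lets Gradle resolve the io.confluent:kafka-schema-registry-parent POM
        maven {
            url "https://packages.confluent.io/maven/"
        }
    }
}
```

With this in place, the parent POM lookup should no longer be restricted to plugins.gradle.org.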
The Confluent Maven plugin supports basic auth (confluentinc/schema-registry#907); update this plugin to support the same.
Hello! I love this plugin, BTW! Thank you for making it!
One feature I am interested in is the ability to switch schema registries depending on which environment I am in (using something like a gradle project property).
(Please excuse my lack of gradle file formatting, this is just a pseudocode idea.)
What do you think?
environment {
name = 'prod'
url = 'http://registry-url:8081/'
download {
// extension of the output file depends on the schema type
subject('avroSubject', '/absolutPath/src/main/avro')
subject('protoSubject', 'src/main/proto')
subject('jsonSubject', 'src/main/json')
}
}
environment {
name = 'staging'
url = 'http://registry-url:8081/'
download {
// extension of the output file depends on the schema type
subject('avroSubject', '/absolutPath/src/main/avro')
subject('protoSubject', 'src/main/proto')
subject('jsonSubject', 'src/main/json')
}
}
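Until something like the environment blocks above exists, a similar effect can be approximated today with a project property selected at configuration time. A rough sketch; the property name `registryEnv` and the URLs are made up for illustration:

```groovy
// build.gradle -- pick the registry per environment with -PregistryEnv=prod
def registryEnv = findProperty('registryEnv') ?: 'staging'
def registryUrls = [
    prod   : 'http://prod-registry-url:8081/',
    staging: 'http://staging-registry-url:8081/',
]

schemaRegistry {
    url = registryUrls[registryEnv]
    download {
        subject('avroSubject', 'src/main/avro')
    }
}
```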
Add a task for schema compatibility.
In the 5.5.X versions, we have to add SchemaReferences when calling the compatibility task and the register task.
A schema reference requires that we pass a version to the registry.
It would be great if we could retrieve the latest version by default. See if we do it here or if we push the modification into the schema-registry API directly.
Hi, first, thank you so much for your work on the plugin!
I was wondering if the output of the testSchemasTask is correct or if it's missing something, as it gives no information on the incompatibility found:
* What went wrong:
1 schemas not compatible, see logs for details.
> java.lang.Throwable (no error message)
The logs don't say anything. I tried with -i, --debug, --stacktrace.
Is the Confluent plugin's output like this too? Or is this plugin failing to output something?
Thanks!
I am using the Maven version of the Schema Registry, and our Schema Registry is protected not with Basic Authentication but with SSL certificates.
Normally we deliver the SSL certificate parameters with -Djavax.net.ssl.trustStore and -Djavax.net.ssl.trustStorePassword, but it seems they have no effect on the Gradle plugin.
Then I debugged the SchemaRegistryClient from Confluent and I discovered that it needs the parameters in the form of
io.confluent\kafka-schema-registry-client\6.0.0\kafka-schema-registry-client-6.0.0.jar!\io\confluent\kafka\schemaregistry\client\CachedSchemaRegistryClient.class
Line 172:
if (configs != null && !configs.isEmpty()) {
  restService.configure(configs);
  Map<String, Object> sslConfigs = configs.entrySet().stream()
      .filter(e -> e.getKey().startsWith(SchemaRegistryClientConfig.CLIENT_NAMESPACE))
      .collect(Collectors.toMap(
          e -> e.getKey().substring(SchemaRegistryClientConfig.CLIENT_NAMESPACE.length()),
          Map.Entry::getValue));
  SslFactory sslFactory = new SslFactory(sslConfigs);
  if (sslFactory != null && sslFactory.sslContext() != null) {
    restService.setSslSocketFactory(sslFactory.sslContext().getSocketFactory());
  }
}
So they have to start with SchemaRegistryClientConfig.CLIENT_NAMESPACE:
public static final String CLIENT_NAMESPACE = "schema.registry.";
but converting the parameters in gradle.properties didn't help:
schema.registry.javax.ssl.trustStore
schema.registry.javax.ssl.trustStorePassword
It seems these parameters must come in via the config map delivered to the SchemaRegistryClient in the constructor.
io.confluent\kafka-schema-registry-client\6.0.0\kafka-schema-registry-client-6.0.0.jar!\io\confluent\kafka\schemaregistry\client\CachedSchemaRegistryClient.class
Line 98:
public CachedSchemaRegistryClient(
    List<String> baseUrls,
    int identityMapCapacity,
    List<SchemaProvider> providers,
    Map<String, ?> originals) {
  this(new RestService(baseUrls), identityMapCapacity, providers, originals, null);
}
The problem is that the Gradle plugin is only interested in the configuration parameters for Basic Authentication and does not deliver any other configuration:
fun downloadSchemas() {
    val errorCount = DownloadTaskAction(
        RegistryClientWrapper.client(url.get(), basicAuth.get()),
        project.rootDir,
        subjects.get()
    ).run()
    if (errorCount > 0) {
        throw GradleScriptException("$errorCount schemas not downloaded, see logs for details", Throwable())
    }
}
I think the plugin needs additional configuration parameters from the class
org.apache.kafka\kafka-clients\2.6.0\kafka-clients-2.6.0.jar!\org\apache\kafka\common\config\SslConfigs.class
ssl.truststore.location
ssl.truststore.password
and maybe keystores
ssl.keystore.location
ssl.keystore.password
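One possible shape for such a feature, purely hypothetical DSL that the plugin does not currently offer (which is the point of this issue): a property whose entries get forwarded as the `originals` map of the CachedSchemaRegistryClient constructor, using the "schema.registry." namespace quoted above.

```groovy
schemaRegistry {
    url = 'https://my-registry:8081'
    // hypothetical "clientConfig" property, not part of the current plugin:
    // entries would be passed through to the client unchanged, so the
    // "schema.registry." prefix makes them visible to the SslFactory
    clientConfig = [
        'schema.registry.ssl.truststore.location': '/etc/ssl/truststore.jks',
        'schema.registry.ssl.truststore.password': 'changeit',
    ]
}
```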
Do real tests against a schema registry.
To do so, we could use testcontainers to spin up a real registry where we could push schemas and do better validation.
When we register new schemas, sometimes they refer to other schemas.
Typical example is for a user record that can contain an address record for example.
A potential implementation is available here: #11
By using the newly created KafkaHelper, we should be able to run the integration tests in a separate Gradle task, driven by an env variable that specifies the Kafka version.
We should at least run tests on versions 5, 6 and 7.
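A sketch of what that separate Gradle task could look like; the task name, JUnit tag, and default version are assumptions:

```groovy
// build.gradle sketch -- a dedicated task for registry-backed integration tests
tasks.register('integrationTest', Test) {
    useJUnitPlatform {
        includeTags 'integration'   // assumed JUnit 5 tag on the KafkaHelper tests
    }
    // version consumed by the testcontainers-based setup; default is assumed
    environment 'KAFKA_VERSION', System.getenv('KAFKA_VERSION') ?: '7.3.1'
}
```

CI could then invoke the task once per supported version by varying KAFKA_VERSION.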
The plugin won't work with avro-serializer:4.1.0 because it uses avro-serializer:3.2.1. How can I use this plugin with avro-serializer:4.1.0?
Add a task responsible for pushing a schema for a given subject.
Hi, this is more like a question.
Is it in the plans of the plugin to support some sort of way to sync the subjects? By that I mean: when we deprecate a subject (e.g. after migrating to a new structure), we are currently forced to remove the deprecated ones manually (or with a script apart from the plugin).
I use Gradle 5.3.1 and schema-registry-plugin 0.5.0. If I change schemaRegistry.url, the URL does not change until I turn off the cache in Gradle:
./gradlew registerSchemaTask --no-build-cache
This should improve readability and help with syntax parsing.
I was trying to use the plugin and one area that seemed a little cumbersome is that if you want to download a number of different schemas you have to list them all individually. As an improvement what do people think about being able to use a regex to specify the subject that we want to download?
I have created a simple change that supports this and will turn it into a PR in a few minutes to show you what I mean. I have made sure that the changes are backward compatible and while I know there are some areas that could be improved on it does currently work. Apart from the usual naming questions the following areas come to mind:
I'd be interested to know what people think. I'd find these changes useful.
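For illustration, the proposed regex support might look like this in the download block; the method name subjectPattern is an assumption, chosen to keep the plain subject() form backward compatible:

```groovy
schemaRegistry {
    url = 'http://localhost:8081'
    download {
        // existing, explicit form
        subject('user-value', 'src/main/avro')
        // hypothetical regex form: download every subject matching the pattern
        subjectPattern('com\\.example\\..*-value', 'src/main/avro')
    }
}
```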
Any plans to support protobuf and json schema (from confluent platform 5.5+)?
Check if the idl format is supported, if not, support it.
I have seen a strange behaviour whereby old versions of our schema have vanished from the schema registry.
How does this plugin increment the version? Is there a way it might have overwritten old versions?
It might be useful for download to automatically fetch all available references (it might work for upload too, but the implementation would be much harder).
Hi,
I am using this plugin to register and test my AVRO schemas.
My configuration looks more or less like the one shown in this example, but has more subjects and references:
schemaRegistry {
...
register {
subject('company', 'schemas/avro/company.avsc', 'AVRO')
.addLocalReference("Address", "schemas/avro/location-address.avsc")
subject('user', 'schemas/avro/user.avsc', 'AVRO')
.addReference('company', 'company', -1)
subject('location-address', 'schemas/avro/location-address.avsc', 'AVRO')
subject('location-latlong', 'schemas/avro/location-latlong.avsc', 'AVRO')
}
compatibility {
subject('company', 'schemas/avro/company_v2.avsc', 'AVRO')
.addLocalReference("Address", "schemas/avro/location-address.avsc")
subject('user', 'schemas/avro/user.avsc', 'AVRO')
.addReference('company', 'company', -1)
subject('location-address', 'schemas/avro/location-address.avsc', 'AVRO')
subject('location-latlong', 'schemas/avro/location-latlong.avsc', 'AVRO')
}
}
Due to the duplicate subject definition, I end up with a lot of duplicate configuration.
Keeping the subjects in sync in the register and compatibility task is error-prone.
To avoid this, I introduced a small abstraction (I only work with local references and AVRO):
data class Subject(
val name: String,
val filePath: String,
val localReferences: Map<String, String>
) {
fun toRegisterSubject() = RegisterSubject(
inputSubject = name,
file = filePath,
type = SchemaType.AVRO,
localReferences = localReferences.map {
LocalReference(name = it.key, path = it.value)
}.toMutableList()
)
fun toCompatibilitySubject() = CompatibilitySubject(
inputSubject = name,
file = filePath,
type = SchemaType.AVRO,
localReferences = localReferences.map {
LocalReference(name = it.key, path = it.value)
}.toMutableList()
)
}
And configure my register and compatibility task like this:
schemaRegistry {
val subjectA = Subject(...)
val subjectB = Subject(...)
compatibility {
subjects.addAll(
subjectA.toCompatibilitySubject(),
subjectB.toCompatibilitySubject()
)
}
register {
subjects.addAll(
subjectA.toRegisterSubject(),
subjectB.toRegisterSubject()
)
}
}
Would it be possible to add a way to declare subjects outside the specific blocks and reuse them?
Or do I miss something and there is already a way to avoid duplicate configuration?
If help is welcome, let me know. I am willing to contribute.
During development, I found that the extensions are not scoped to the schemaRegistry one.
We need to change this for the plugin to co-exist with other plugins.
Before version 1.0, it was possible to split an Avro schema into multiple files and pass all of them to a single subject() call in the register block.
The addReference() DSL method is meant to replace this, but it only works if the external reference is registered as a standalone subject in the Schema Registry. This is not the case in many of our uses, where we simply move the definition of a nested record type into a separate Avro file for convenience. This is not possible with the new DSL.
It would be great if the previous set-of-Avro-files approach could be re-added to the DSL somehow, perhaps through an addInternalReference() method.
We are running the compatibility test for a large repository of schemas.
The compatibility task creates many "non informative" log lines when a schema is compatible.
I'd like to be able to decide whether I want to see all logs or only logs for failures.
Specifically these 2 lines:
When I enable schema registry plugin it breaks build of my project with exception:
A problem occurred configuring root project 'demo'.
> Could not resolve all artifacts for configuration ':classpath'.
> Could not resolve com.github.imflog:kafka-schema-registry-gradle-plugin:1.4.0.
Required by:
project : > com.github.imflog.kafka-schema-registry-gradle-plugin:com.github.imflog.kafka-schema-registry-gradle-plugin.gradle.plugin:1.4.0
> Could not resolve com.github.imflog:kafka-schema-registry-gradle-plugin:1.4.0.
> Could not parse POM https://plugins.gradle.org/m2/com/github/imflog/kafka-schema-registry-gradle-plugin/1.4.0/kafka-schema-registry-gradle-plugin-1.4.0.pom
> Could not find io.confluent:kafka-schema-registry-parent:6.2.0.
Searched in the following locations:
- https://plugins.gradle.org/m2/io/confluent/kafka-schema-registry-parent/6.2.0/kafka-schema-registry-parent-6.2.0.pom
If the artifact you are trying to retrieve can be found in the repository but without metadata in 'Maven POM' format, you need to adjust the 'metadataSources { ... }' of the repository declaration.
Plugins section:
plugins {
id 'org.springframework.boot' version '2.3.12.RELEASE'
id 'io.spring.dependency-management' version '1.0.11.RELEASE'
id "com.github.imflog.kafka-schema-registry-gradle-plugin" version "1.4.0"
id 'com.google.protobuf' version '0.8.16'
id 'java'
id 'idea'
}
Repository settings:
repositories {
mavenCentral()
maven {
url "https://packages.confluent.io/maven/"
}
}
Most likely it happens because I am trying to resolve the plugin using the plugins keyword from the Gradle plugin repository.
Gradle version: 6.9
Plugin versions checked: 1.4.0 & 1.3.0
I am trying to use the plugin to register a schema, but the Gradle plugin doesn't seem to honor or pass the proxy information to the client. I have tried every way of setting the proxy (command-line args, env variables, gradle.properties, etc.), but none are used when the call is made through the CachedSchemaRegistryClient from the Maven plugin.
There is no breaking change (for now) in the schema-registry libraries.
It could be useful to:
When I tried to upgrade from 0.8.0 to 0.9.0 and above, I get the following error:
An exception occurred applying plugin request [id: 'com.github.imflog.kafka-schema-registry-gradle-plugin', version: '1.2.0']
Failed to apply plugin [id 'com.github.imflog.kafka-schema-registry-gradle-plugin']
Could not create an instance of type com.github.imflog.schema.registry.SchemaRegistryExtension_Decorated.
> Could not find any public constructor for class com.github.imflog.schema.registry.SchemaRegistryExtension_Decorated which accepts parameters [].
Could we have some guidance on upgrade? I did notice that SchemaRegistryExtension was refactored in 0.9.0.
Hi,
I'm getting this error when adding the plugin:
> Could not resolve all artifacts for configuration ':classpath'.
> Could not find io.confluent:kafka-schema-registry:5.0.0.
Searched in the following locations: https://plugins.gradle.org/m2/io/confluent/kafka-schema-registry/5.0.0/kafka-schema-registry-5.0.0.pom
Required by:
project : > com.github.imflog.kafka-schema-registry-gradle-plugin:com.github.imflog.kafka-schema-registry-gradle-plugin.gradle.plugin:0.7.0 > com.github.imflog:kafka-schema-registry-gradle-plugin:0.7.0
I've tried to add the dependency implementation "io.confluent:kafka-schema-registry:5.0.0" and it worked fine.
Any ideas?
Thank you.
When running the registerSchemas task, no information about the IDs of the published schemas is printed or saved to a file, and this is useful for configuring the event producer that uses the respective schemas.
We can get the ID by querying the latest version, but when publishing an older version of the schema the ID no longer corresponds to the latest version.
As in #71 but for Protobuf
Change the Download extension to match the other ones.
Then we could refactor extensions (merge them into one).
Hi,
I would like to pass configuration for task testSchemasTask from the command line:
./gradlew :testSchemasTask -Purl='localhost:8081' -Pusername=user -Ppassword=password
Or maybe there already exists any other mechanism for injecting the configuration?
I also tried to decorate the task with something like this:
testSchemasTask.doFirst { environment 'SCHEMA_REGISTRY_URL', project.property("url") }
schemaRegistry {
url = System.env['SCHEMA_REGISTRY_URL']
credentials {
username = System.env['SCHEMA_REGISTRY_USERNAME']
password = System.env['SCHEMA_REGISTRY_PASSWORD']
}
...
}
But it also doesn't work. Gradle cannot resolve testSchemasTask. Could you give some advice?
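Project properties are read when the build script is evaluated, so they can feed the schemaRegistry block directly without touching the task at all. A sketch; the property names are arbitrary:

```groovy
// invoked as:
//   ./gradlew testSchemasTask -PregistryUrl=http://localhost:8081 \
//       -PregistryUser=user -PregistryPassword=password
schemaRegistry {
    url = findProperty('registryUrl') ?: 'http://localhost:8081'
    credentials {
        username = findProperty('registryUser') ?: ''
        password = findProperty('registryPassword') ?: ''
    }
}
```

The `?:` fallbacks keep the build working when no properties are passed.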
Hi,
First of all thank you so much for your effort to support the schema-registry in gradle.
I am afraid that after adding the config below, I get an error message and am not able to proceed.
buildscript {
repositories {
maven {
url "https://plugins.gradle.org/m2/"
}
}
dependencies {
classpath "com.github.imflog:kafka-schema-registry-gradle-plugin:0.7.0"
}
}
Error Message:
Could not find io.confluent:kafka-schema-registry:5.0.0.
Searched in the following locations:
- https://plugins.gradle.org/m2/io/confluent/kafka-schema-registry/5.0.0/kafka-schema-registry-5.0.0.pom
- https://plugins.gradle.org/m2/io/confluent/kafka-schema-registry/5.0.0/kafka-schema-registry-5.0.0.jar
- https://plugins.gradle.org/m2/io/confluent/kafka-schema-registry/5.0.0/kafka-schema-registry-5.0.0.pom
- https://plugins.gradle.org/m2/io/confluent/kafka-schema-registry/5.0.0/kafka-schema-registry-5.0.0.jar
The reason is that kafka-schema-registry:5.0.0 is not available; I see only 5.2.1.
You can check the same at the link below:
https://mvnrepository.com/artifact/io.confluent/kafka-schema-registry
Let me know if you need any information.
Thanks,
Dilip Sundarraj.
It might be useful to specify a name for the downloaded protos, as well as a path (already implemented).
Currently the tests rely heavily on Wiremock to fake responses. That's great for edge-case scenarios, but for common operations it would be better to use a real Schema Registry.
We could use https://github.com/testcontainers/testcontainers-java with the Kafka module to do so.
Hi, first of all, thank you very much for your awesome work!
I have a question regarding the 'download' task: the task downloads the latest version of the schema by default, which is expected behaviour. Nevertheless, in my use case I need to download older schema versions as well and was wondering if this is possible. I was not able to find a hint in the documentation on how to do that. Supporting this feature, or giving me a hint how to do it in the current version, would be great.
Thank you
Ali
I would like to be able to have a CI pipeline for our avro schemas. The steps would be:
Compatibility check fails when a subject does not exist already in the schema registry. The error code that gets returned is:
> io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject not found.; error code: 40401
Can we modify the behavior so that the compatibility check passes if there is no pre-existing schema for that subject? I know that is not the way the Maven plugin currently behaves.
Hi,
I tried changing the standard config, but it's not reading my values and always goes to the same config (localhost:8081 and src/main/avro).
schemaRegistry {
url = 'http://xxxxx.com:8081/'
download {
output = 'src/main/avro/external'
subjects = ['subject-value']
}
}
I'd like to propose migrating to Gradle's task configuration avoidance as outlined here.
I'm trying to check compatibility of schemas that have been generated from Avro IDL files using davidmc24/gradle-avro-plugin.
Currently all plugin configuration in the schemaRegistry block is executed immediately when running any Gradle task. Ideally this configuration would execute only when the task that requires the config executes.
The following does not work, as configuration happens before schema files are generated:
schemaRegistry {
url = "http://schema-registry.example.com:8081"
compatibility {
fileTree(dir: "build/generated-avro-avsc", include: "**/*.avsc").each { file ->
subject("subjectName", relativePath(file))
}
}
}
Instead I need to do this:
schemaRegistry {
url = "http://schema-registry.example.com:8081"
}
tasks.whenTaskAdded { task ->
if (task.name == "testSchemasTask") {
task.doFirst {
schemaRegistry {
compatibility {
fileTree(dir: "build/generated-avro-avsc", include: "**/*.avsc").each { file ->
subject("subjectName", relativePath(file))
}
}
}
}
}
}
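A slightly tidier form of the same workaround, assuming the task already exists when the build script is evaluated (so whenTaskAdded is unnecessary):

```groovy
// defer the subject configuration until just before the task runs
tasks.named('testSchemasTask') {
    doFirst {
        schemaRegistry {
            compatibility {
                // by now the avro plugin has generated the .avsc files
                fileTree(dir: "build/generated-avro-avsc", include: "**/*.avsc").each { file ->
                    subject("subjectName", relativePath(file))
                }
            }
        }
    }
}
```

The real fix would be for the plugin itself to adopt lazy (Provider-based) configuration, as the issue proposes.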
Following the Maven plugin: if https://github.com/commercehub-oss/gradle-avro-plugin is present on the classpath, we should try to generate the classes from the local schemas.
Because why not?
It would be good to have a directory with usage examples of the plugin (in Kotlin or Groovy):
Hi Florian,
I am trying to use "com.github.imflog.kafka-schema-registry-gradle-plugin" to push my schemas to my local schema registry, and I am getting a "502" with the below error on the client side.
I suspect this to be some encoding issue during serialization, though I am not sure about this.
If you have encountered a similar issue, or if you are aware of this problem, kindly help me.
io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character ('<' (code 60)): expected a valid value (number, String,
array, object, 'true', 'false' or 'null')
at [Source: (sun.net.www.protocol.http.HttpURLConnection$HttpInputStream); line: 1, column: 2]; error code: 50005
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:170)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:188)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:245)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:237)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:232)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:59)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:91)
at com.github.imflog.schema.registry.register.RegisterTaskAction.registerSchema(RegisterTaskAction.kt:35)
at com.github.imflog.schema.registry.register.RegisterTaskAction.run(RegisterTaskAction.kt:21)
at com.github.imflog.schema.registry.register.RegisterSchemasTask.registerSchemas(RegisterTask.kt:29)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:48)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:41)
at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:704)
at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:671)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.run(ExecuteActionsTaskExecuter.java:284)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:301)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:293)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:175)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:91)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:273)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:258)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:67)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:145)
at org.gradle.internal.execution.impl.steps.ExecuteStep.execute(ExecuteStep.java:49)
at org.gradle.internal.execution.impl.steps.CancelExecutionStep.execute(CancelExecutionStep.java:34)
at org.gradle.internal.execution.impl.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:69)
at org.gradle.internal.execution.impl.steps.TimeoutStep.execute(TimeoutStep.java:49)
at org.gradle.internal.execution.impl.steps.CatchExceptionStep.execute(CatchExceptionStep.java:33)
at org.gradle.internal.execution.impl.steps.CreateOutputsStep.execute(CreateOutputsStep.java:50)
at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:43)
at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:29)
at org.gradle.internal.execution.impl.steps.CacheStep.executeWithoutCache(CacheStep.java:134)
at org.gradle.internal.execution.impl.steps.CacheStep.lambda$execute$3(CacheStep.java:83)
at java.util.Optional.orElseGet(Optional.java:267)
at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:82)
at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:36)
at org.gradle.internal.execution.impl.steps.PrepareCachingStep.execute(PrepareCachingStep.java:33)
at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:38)
at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:23)
at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:96)
at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$0(SkipUpToDateStep.java:89)
at java.util.Optional.map(Optional.java:215)
at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:52)
at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)
at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:91)
at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:57)
at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:119)
at org.gradle.api.internal.tasks.execution.ResolvePreviousStateExecuter.execute(ResolvePreviousStateExecuter.java:43)
at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:94)
at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:56)
at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:55)
at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:67)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:315)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:305)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:175)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:101)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:49)
at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
at java.lang.Thread.run(Thread.java:748)
Environment:
Sample configuration:
buildscript {
    repositories {
        maven {
            url "http://packages.confluent.io/maven/"
        }
        mavenCentral()
    }
    dependencies {
        classpath("org.jfrog.buildinfo:build-info-extractor-gradle:4+")
    }
}

plugins {
    id "java-library"
    id "maven"
    id "maven-publish"
    id "com.commercehub.gradle.plugin.avro" version "0.9.1"
    id "com.github.imflog.kafka-schema-registry-gradle-plugin" version "0.5.0"
}

ext {
    build_version = System.getenv("VERSION_NUMBER") as String ?: "1.0.0"
}

group "com.sample.test"
version "${build_version}"

repositories {
    jcenter()
    mavenLocal()
    maven {
        url "http://packages.confluent.io/maven/"
    }
    mavenCentral()
}

dependencies {
    api "org.apache.avro:avro:1.8.2"
}

schemaRegistry {
    url = 'http://localhost:8081/'
    register {
        subject('mytopic', 'src/main/avro/mysample.avsc')
    }
}
Avro schema used:
{
    "namespace": "com.mysample.schema",
    "name": "mysample",
    "type": "record",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "id", "type": "string"}
    ]
}
We have a use case where we need to do custom serialization. As part of this, we need to make it compatible with the way Confluent's schema-registry ecosystem works, i.e. we need to prepend the schema ID, encoded in four bytes, to the message, followed by the Avro payload itself.
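For reference, the Confluent wire format additionally prefixes a magic byte (0x00) before the 4-byte big-endian schema ID. A minimal sketch of framing a message this way (the class and method names here are illustrative, not part of any Confluent API):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

// Sketch of the Confluent wire format: one 0x00 magic byte, then the
// 4-byte big-endian schema ID, then the serialized Avro payload.
public class WireFormat {
    private static final byte MAGIC_BYTE = 0x0;

    public static byte[] encode(int schemaId, byte[] avroPayload) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(MAGIC_BYTE);
        // ByteBuffer defaults to big-endian byte order, as the wire format expects.
        out.write(ByteBuffer.allocate(4).putInt(schemaId).array());
        out.write(avroPayload);
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] framed = encode(9580, new byte[] {1, 2, 3});
        System.out.println(framed.length); // prints 8 (1 magic byte + 4 ID bytes + 3 payload bytes)
    }
}
```

A consumer-side deserializer would do the reverse: read and check the magic byte, read the next four bytes as the schema ID, look the schema up in the registry, then decode the remainder as Avro.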
Currently, when you download a schema, you do not get its metadata. It should not be too difficult to save the schema metadata, maybe in a different folder, as JSON files.
Currently, the schema registry returns the following metadata when looking up a schema by subject and version:
{
    "subject": "io.github.ferozed.MySubject",
    "version": 1,
    "id": 9580,
    "schema": "..."
}
So, the DownloadTaskAction could save this JSON with the same filename as the subjectName in a different folder.
Any thoughts on this proposal?
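The proposal above could be sketched roughly as follows; the class name, method name, and directory layout are hypothetical, chosen only to illustrate the idea, and are not part of the plugin's current API:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Illustrative sketch: next to the downloaded schema file, persist the
// registry's lookup response as <subjectName>.json in a separate
// metadata folder, so consumers can recover the schema ID and version.
public class MetadataWriter {

    public static Path saveMetadata(Path metadataDir, String subjectName, String metadataJson)
            throws IOException {
        Files.createDirectories(metadataDir);
        // Reuse the subject name as the file name, mirroring how the
        // download task names schema files.
        Path target = metadataDir.resolve(subjectName + ".json");
        Files.writeString(target, metadataJson);
        return target;
    }
}
```

Keeping the metadata in its own folder avoids confusing downstream tooling (e.g. Avro code generation) that globs the schema output directory for .avsc files.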
For now we cannot mix local and remote dependencies.
If we use an ordered structure, we might be able to overcome the fact that some local dependencies may need a remote one and vice versa.
That would require some fetching from the registry before the global parsing (doable with the client).
In the meantime we should try the ZooKeeper-less version of Kafka.
Hi there.
The Maven plugin that this project adapts supports specifying multiple schema registries.
Is this also available with this plugin?
Thanks.