
Kafka WebView

This project aims to be a full-featured web-based Apache Kafka consumer. Kafka WebView presents an easy-to-use web-based interface for reading data out of Kafka topics, with basic filtering and searching capabilities.

Features

  • Connect to multiple remote Kafka Clusters.
  • Connect to SSL and SASL authenticated clusters.
  • Supports standard key and value deserializers.
  • Supports uploading custom key and value deserializers.
  • Supports both customizable and enforced filtering over topics.
  • Supports multiple user management and access control options:
    • Use in app defined users (default).
    • Use LDAP server for authentication and authorization.
    • Disable user authorization entirely (open anonymous access).
  • Web Based Consumer Supports:
    • Seeking to offsets.
    • Seeking to timestamps.
    • Filtering by partition.
    • Configurable server-side filtering logic.
  • "Live" web socket based streaming consumer.
  • Consumer group state monitoring.

Screen Shots

Web Consumer

Web Consumer Screenshot

Streaming Consumer

Streaming Consumer Screenshot

Configuration

Configuration Screenshot

Installation from release distribution

Download the latest release package and extract the ZIP file.

Configuration

Edit the config.yml file at the root of the extracted package.

The server port setting controls which port the Kafka WebView UI binds to.

The app key should be modified to be unique to your installation. This key will be used for symmetric encryption of JKS/TrustStore secrets if you configure any SSL enabled Kafka clusters.

By default config.yml will look similar to:

server:
  ## What port to run the service on.
  port: 8080
  servlet:
    session:
      ## User login session timeout after 1 hour (3600 seconds)
      timeout: 3600


## Various App Configs
app:
  ## Should be unique to your installation.
  ## This key will be used for symmetric encryption of JKS/TrustStore secrets if you configure any SSL enabled Kafka clusters.
  key: "SuperSecretKey"

  ## Defines a prefix prepended to the Id of all consumers.
  consumerIdPrefix: "KafkaWebViewConsumer"
  
  ## Enable multi-threaded consumer support
  ## The previous single-threaded implementation is still available by setting this property to false.
  ## The previous implementation along with this property will be removed in a future release.
  multiThreadedConsumer: true
  
  ## Sets upper limit on the number of concurrent consumers (non-websocket) supported.
  maxConcurrentWebConsumers: 32

  ## Sets upper limit on the number of concurrent web socket consumers supported.
  maxConcurrentWebSocketConsumers: 64

  ## Require SSL
  requireSsl: false

  ## User authentication options
  user:
    ## Require user authentication
    ## Setting to false will disable login requirement.
    enabled: true
    
    ## Optional: if you want to use LDAP for user authentication instead of locally defined users.
    ldap:
      ## Disabled by default.  See below for more details on how to configure.
      enabled: false

Starting the service

The Kafka WebView UI can be started by running the start.sh script from the root of the extracted package. This should start a web server on the port you configured. Point your browser at http://your.host.name:8080 (substituting your host and configured port) and follow the Logging in for the first time instructions below.
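For example (file names here are illustrative; substitute the version you downloaded):

unzip kafka-webview-ui-X.Y.Z-bin.zip
cd kafka-webview-ui-X.Y.Z

## Edit config.yml as described above, then start the service:
./start.sh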

Running from docker image

Docker images can be found on Docker Hub.

Start up the latest docker image by running docker run -it -p 8080:8080 -v kafkawebview_data:/app/data sourcelaborg/kafka-webview:latest

Point your browser at http://localhost:8080 and follow the Logging in for the first time instructions below.

Building from source

To build and run from the latest source code requires JDK 1.8 and Maven 3.3.9+. Clone this project and run the buildAndRun.sh script to compile the project and start the service.
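For example:

git clone https://github.com/SourceLabOrg/kafka-webview.git
cd kafka-webview

## Compiles the project and starts the service:
./buildAndRun.sh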

Point your browser at http://localhost:8080 and follow the Logging in for the first time instructions below.

Configure user authentication method

Kafka WebView supports three different methods for authenticating and authorizing users for access control.

Locally defined users

Using the default configuration, Kafka WebView requires users to log in to access the app. These users are defined locally by an administrator user and managed within the application.

Your application yml file should be configured with the following options:

## App Configs
app:
  ## User authentication options
  user:
    ## Ensure user authentication is ENABLED
    enabled: true
    
    ## Ensure that LDAP authentication is DISABLED
    ldap:
      enabled: false

LDAP Authenticated users

Kafka WebView can be configured to authenticate users via an LDAP service. When LDAP authentication is enabled, you will no longer be able to manage users from within the application.

Your application yml file should be configured with the following options:

## App Configs
app:
  ## User authentication options
  user:
    ## Ensure user authentication is ENABLED
    enabled: true
    
    ## Ensure that LDAP authentication is ENABLED
    ldap:
      enabled: true
      
      ## Example values defined below, adjust as needed.
      ## How to find user records
      userDnPattern: "uid={0},ou=people"
      
      ## The attribute in which the password is stored.
      passwordAttribute: "userPassword"
      
      ## Where to find user group membership
      groupSearchBase: "ou=groups"
      groupRoleAttribute: "cn"
      groupSearchFilter: "(uniqueMember={0})"

      ## How passwords are validated, must implement PasswordEncoder interface
      passwordEncoderClass: "org.springframework.security.crypto.password.LdapShaPasswordEncoder"

      ## Comma separated list of groups. A user which is a member of this group will be granted
      ## administrator access to Kafka WebView.
      adminGroups: "ADMINGROUP1,ADMINGROUP2"

      ## Comma separated list of groups. A user which is a member of this group will be granted
      ## standard user level access to Kafka WebView.
      userGroups: "USERGROUP1,USERGROUP2"

      ## Any user who is not a member of at least one of the above groups will be denied access
      ## to Kafka WebView.

      ## URL/Hostname for your LDAP server
      url: "ldap://localhost:8389/dc=example,dc=org"

      ## If LDAP does not allow anonymous access, define the user/password to connect using.
      ## If not required, leave both fields empty
      bindUser: "cn=ManagementUser"
      bindUserPassword: "password-here"

Anonymous / Open access

Kafka WebView can also be configured for open and anonymous access.

Your application yml file should be configured with the following options:

## App Configs
app:
  ## User authentication options
  user:
    ## Ensure user authentication is DISABLED
    enabled: false

Reverse proxy setup

Kafka WebView can be configured to run behind a reverse proxy. See docs/httpdReverseProxySetup.md for detailed instructions.

Logging in for the first time

NOTE If you've disabled user authentication in your configuration, no login will be required. Skip directly to Step 2.

NOTE If you've enabled LDAP user authentication in your configuration, you will instead log in with a user defined in your LDAP service that is a member of a group you've configured with admin access. After successfully authenticating, skip directly to Step 2.

On first startup, a default Administrator user is created for you. Log in using [email protected] with the password admin.

NOTE After logging in you should create your own Administrator user and remove the default account.

Setup

1. Setup users

You first need to configure who has access to Kafka WebView. Kafka WebView provides two roles for users: Admin and User.

  • Admin has the ability to manage and configure all aspects of WebView, including defining Kafka Clusters, adding/removing users, defining Views, etc.
  • User has the ability to view Cluster information and consume previously defined Views.

NOTE If you've logged in with the Default Admin account, you'll want to create your own Administrator user account and remove the default one.

2. Connect Kafka clusters

You'll need to let WebView know which Kafka clusters you want to connect to.

WebView supports connecting to Clusters using plaintext or SSL. You'll need to follow the standard Kafka consumer client directions to create a Java Key Store (JKS) for your Trusted CA (TrustStore), and a JKS for your Consumer Key (KeyStore).
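As a rough sketch of the keytool steps (file names and passwords below are placeholders; follow your organization's certificate process):

## Import your Certificate Authority into a TrustStore JKS:
keytool -importcert -alias CARoot -file ca-cert.pem \
  -keystore kafka.client.truststore.jks -storepass changeit -noprompt

## Convert an existing PKCS12 client certificate bundle into a KeyStore JKS:
keytool -importkeystore -srckeystore client.p12 -srcstoretype PKCS12 \
  -destkeystore kafka.client.keystore.jks -deststorepass changeit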

If authenticating to a cluster using SASL, you'll need to define your authentication method and JAAS configuration.

3. Configure custom Message Formats (Optional)

Kafka allows you to store data within the Cluster in any data format and provides an interface to define how clients should deserialize your data. Out of the box Kafka WebView supports the following Deserializers, which can be used for both Keys and Values:

  • ByteArray
  • Bytes
  • Double
  • Float
  • Integer
  • Long
  • Short
  • String

Often, data is stored using a custom format such as Avro or Protocol Buffers. Admin users can upload a JAR containing custom Deserializer implementations so that WebView can properly deserialize your data format.

4. Configure Filters (Optional)

Filters are a construct unique to WebView. Filters allow you to implement an interface that is used on the server side to filter messages coming from Kafka. Filtering server side has several benefits: filters can act as a simple search, avoiding passing large amounts of data to the client web browser when you're looking for a small subset of messages, and they can be used to enforce a restricted view of data from a Topic.

5. Define Views

Views are the last step, putting all of the pieces together. Views let you configure a Topic to consume from, which Message Formats the Topic uses, and optionally any Filters to apply.

Writing Custom Deserializers

The Deserializer interface is provided by Kafka; WebView requires nothing special or additional beyond implementing this interface. If you already have a Deserializer implementation for consuming from Kafka, you can use it as is.

If you don't already have an implementation, you can view the Deserializer interface in the Kafka documentation.

Important Note: Kafka WebView will attempt to automatically convert objects returned from the Deserializer interface into a JSON representation for easy display in the browser (by way of Jackson). This process is imperfect -- if you want your objects to be rendered within the browser in a specific way, it is highly recommended that your Deserializer implementation return a pre-formatted String instead of a complex object.
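A minimal sketch of such a Deserializer that returns a pre-formatted String (MyMessage is a placeholder for your own generated Protobuf/Avro class; swap in your own decoding logic):

package examples.deserializer;

import java.util.Map;
import org.apache.kafka.common.serialization.Deserializer;

/**
 * Sketch: decodes raw bytes and returns a pre-formatted String so that
 * Kafka WebView can render the record directly in the browser.
 */
public class MyMessageDeserializer implements Deserializer<String> {
    @Override
    public void configure(final Map<String, ?> configs, final boolean isKey) {
        // No configuration needed for this sketch.
    }

    @Override
    public String deserialize(final String topic, final byte[] data) {
        if (data == null) {
            return null;
        }
        try {
            // MyMessage is hypothetical -- replace with your own decoder.
            final MyMessage message = MyMessage.parseFrom(data);
            return message.toString();
        } catch (final Exception exception) {
            // Surface undecodable payloads instead of failing the consumer.
            return "Unable to deserialize record: " + exception.getMessage();
        }
    }

    @Override
    public void close() {
        // No resources to release.
    }
}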

Writing Custom Filters

The RecordFilter Interface is provided by Kafka WebView and is NOT part of the standard Kafka library.
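As a rough sketch, a filter that only passes records whose value contains a configured search term might look like the following (the method names follow the RecordFilter interface as used in the example project below; consult that project for the authoritative definition):

package examples.filter;

import java.util.Collections;
import java.util.Map;
import java.util.Set;
import org.sourcelab.kafka.webview.ui.plugin.filter.RecordFilter;

/**
 * Sketch: server-side filter that only includes records whose String
 * representation contains a user-supplied search term.
 */
public class ContainsStringFilter implements RecordFilter {
    private String searchTerm = "";

    @Override
    public Set<String> getOptionNames() {
        // The user supplies a value for this option when applying the filter.
        return Collections.singleton("searchTerm");
    }

    @Override
    public void configure(final Map<String, ?> consumerConfigs, final Map<String, String> filterOptions) {
        searchTerm = filterOptions.getOrDefault("searchTerm", "");
    }

    @Override
    public boolean includeRecord(final String topic, final int partition, final long offset, final Object key, final Object value) {
        return value != null && value.toString().contains(searchTerm);
    }

    @Override
    public void close() {
        // No resources to release.
    }
}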

Example Deserializer and Filters Project

To get up and running quickly, the Kafka-WebView-Example project on GitHub can be cloned and used as a template. This Maven-based example project is configured with all of the correct dependencies and includes a few example implementations.

Releasing

Steps for performing a release:

  1. Update release version: mvn versions:set -DnewVersion=X.Y.Z
  2. Validate and then commit version: mvn versions:commit
  3. Update CHANGELOG and README files.
  4. Merge to master.
  5. Deploy to Maven Central: mvn clean deploy -P release-kafka-webview
  6. Build and upload new Docker images:
    • Edit Dockerfile and update version and sha1 hash.
    • docker build -t kafka-webview .
    • docker tag kafka-webview sourcelaborg/kafka-webview:latest
    • docker push sourcelaborg/kafka-webview:latest
    • docker tag kafka-webview sourcelaborg/kafka-webview:2.2.VERSIONHERE
    • docker push sourcelaborg/kafka-webview:2.2.VERSIONHERE
    • Commit updated docker files.
  7. Create a release on the GitHub project.

Changelog

The format is based on Keep a Changelog and this project adheres to Semantic Versioning.

View Changelog

kafka-webview's People

Contributors

blueicarus, bygui86, crim, dependabot[bot], lucrito, quentingodeau


kafka-webview's Issues

Create Kafka Producer

I would like to know if you can offer a Kafka producer feature; it would be great.

Allow for more control over consumer group used

Clusters that have authorization configured need more options around the consumer group Id (consumerIdPrefix), as ACLs are often based on the consumer group used.

For example, as it is now, the consumerIdPrefix is not sufficient because ACLs are linked to a specific/static consumer group Id, and therefore a varying consumerIdPrefix + suffix combination does not work in this scenario.

Besides supporting a static consumer group Id, it would be great if a Cluster-default consumer group Id could be provided when creating a new Cluster, with the option to override that consumer group per View.

Error "No serializer found for class..."

When deserializing from Kafka into objects that have no registered jackson serializer you get the error below.

We should handle this situation better by telling Jackson to fall back to using that object's toString() method (see the sketch after the error below).

Error Could not write JSON: No serializer found for class com.google.protobuf.UnknownFieldSet$Parser and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS); nested exception is com.fasterxml.jackson.databind.JsonMappingException: No serializer found for class com.google.protobuf.UnknownFieldSet$Parser and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS) (through reference chain: org.sourcelab.kafka.webview.ui.manager.kafka.dto.KafkaResults["results"]->java.util.Collections$UnmodifiableRandomAccessList[0]->org.sourcelab.kafka.webview.ui.manager.kafka.dto.KafkaResult["value"]
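One possible sketch of that fallback (not the project's actual fix): disable FAIL_ON_EMPTY_BEANS and register Jackson's ToStringSerializer for the offending type.

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.ser.std.ToStringSerializer;

public class JacksonToStringFallback {
    public static ObjectMapper createMapper() {
        final ObjectMapper mapper = new ObjectMapper();
        // Stop Jackson from throwing on classes with no discoverable properties.
        mapper.disable(SerializationFeature.FAIL_ON_EMPTY_BEANS);

        // Render this specific type via its toString() instead of bean introspection.
        final SimpleModule module = new SimpleModule();
        module.addSerializer(com.google.protobuf.UnknownFieldSet.Parser.class, new ToStringSerializer());
        mapper.registerModule(module);
        return mapper;
    }
}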

streamlined configuration options

Excellent tool; it really provides good insights. We're using it on several production environments because it's such a great addition to our Kafka setup. When we deploy it, we first spin up the H2 db in server mode and use a Python script to execute SQL updates that set all the environment-specific values (clusters, views, users, ...) before starting kafka-webview. This approach runs as part of our automated deploy chain, and updating H2 this way takes just a few seconds to complete, but it's only valid for the 1.0.5-specific H2 backend. A more streamlined approach would be welcome to guard against future updates or model changes.

Once again, thanks for this really great application!

Issue with the start.sh script

Hello,

When one uses the start.sh script, java -jar tries to start kafka-webview-ui-2.0.0-javadoc.jar instead of kafka-webview-ui-2.0.0.jar.

See this line: https://github.com/SourceLabOrg/kafka-webview/blob/master/kafka-webview-ui/src/assembly/distribution/start.sh#L20

## launch webapp
java -jar kafka-webview-ui-*.jar $HEAP_OPTS $LOG_OPTS

I had to modify the script as follows, in order for the Spring Boot app to start:

## launch webapp
java -jar kafka-webview-ui-2.0.0.jar $HEAP_OPTS $LOG_OPTS


Support running Kafka Webview behind reverse proxy

I am running Kafka Webview in a Kubernetes cluster and I'm trying to expose it publicly using Kubernetes Ingress (reverse proxy mechanism, backed by traefik).

I want to set up a reverse proxy rewrite rule that translates all external requests that match the path prefix /dashboards/kafka to / on the service that is running Kafka Webview. This works for accessing the index page or the static resources, but the page links are incorrect and redirects fail, because you are using absolute paths in the JSP templates (e.g. the authentication procedure automatically redirects to /login, but should be login so it is externally reachable from /dashboards/kafka/login through the reverse proxy).

Another solution could be to use the Spring Boot server.servlet.context-path configuration property. Using this property I could set a prefix path for the Kafka Webview (dashboards/kafka) and let the reverse proxy redirect the paths as-is, as in the snippet below. For this to work, I guess you should add the variable ${pageContext.request.contextPath} before all href paths?
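For example (a sketch using the standard Spring Boot property):

server:
  servlet:
    context-path: /dashboards/kafka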

Or is there another easy workaround?

Initial run requires 2+ GB of RAM

Hey there.

Don't know if that's expected, but the app requires 2+ GB of RAM available to boot.
If you set an upper limit of 2-3 GB on it in Kubernetes/Docker, it crashes without booting.

webview-845ddb486d-sv6wj webview Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
webview-845ddb486d-sv6wj webview 2019-03-14 19:43:05.412 ERROR 8 --- [           main] o.s.boot.SpringApplication               : Application run failed
webview-845ddb486d-sv6wj webview
webview-845ddb486d-sv6wj webview org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'stompWebSocketHandlerMapping' defined in class path resource [org/springframework/web/socket/config/annotation/DelegatingWebSocketMessageBrokerConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.web.servlet.HandlerMapping]: Factory method 'stompWebSocketHandlerMapping' threw exception; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'messageBrokerTaskScheduler' defined in class path resource [org/springframework/web/socket/config/annotation/DelegatingWebSocketMessageBrokerConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler]: Factory method 'messageBrokerTaskScheduler' threw exception; nested exception is java.lang.IllegalArgumentException: 'poolSize' must be 1 or higher
webview-845ddb486d-sv6wj webview 	at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:591) ~[spring-beans-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]

That seems like an enormous amount of RAM to boot a web app.

Extremely slow filtering

Not sure if this is known or not,

but I've got kafka-webview up and running with my data system. (just having the ability to view topics has been awesome!)

Our Kafka topics typically contain millions of records serialized with protocol buffers.
When I attempt to write my own filter, or use the example string filter, I see approximately 5 records filtered per second.

This makes filters virtually unusable for any topic with more than a handful of records.

Is this known? Are there any plans to improve this, or is there a known reason why? Or perhaps it's something on my end...

Also, I'm running the backend on a bare-metal server with 12 physical cores and 65 GB RAM. It doesn't seem to be using anywhere close to 10% of system resources.

User Permission to Create Views

First of all, great project. I just deployed it with a custom Apache Avro deserializer and everything works like a charm.

For my setup, I would like a little more flexibility in terms of user permissions. I do not want users to be able to modify topic settings, but I do want to give them the ability to create views. This effectively means I cannot share admin access (since that gives too much power over cluster/topic configuration), but user permissions are not enough (since admins would have to create views for users). Is this something that could be changed easily? Happy to help with a PR if you send me some pointers.

Cluster connection issue with Docker

Hi @Crim ,

Thanks for this awesome project!
When I use the docker image to connect to an unsecured Kafka server (webview, Kafka, and ZooKeeper are all deployed in the same VM in EC2), I get the error below:

WARN 9 --- [eration-UserId1] org.apache.kafka.clients.NetworkClient : Connection to node -1 could not be established. Broker may not be available.

Error connecting to cluster: org.apache.kafka.common.errors.TimeoutException: Timed out waiting for a node assignment.

But I can connect the Kafka server via command line tool.

Did I miss anything?

Thanks.

Open stream behind reverse proxy

Thanks to the work on #137 we are running kafka-webview behind a reverse proxy on our K8s clusters. It has been a tremendous help to quickly debug our flows!

The only hiccup we encounter now is that the stream view/websocket still redirects to the old URL (not prefixed). When checking calls we see it performing a GET https://myhost/websocket/info?t=1550846293947 instead of a GET https://myhost/{configured pathPrefix}/websocket/info?t=1550846293947

Docker image

I would love to give this a spin, but it'd be amazing if it had a Docker image to quickly get it up and running against existing setups.

How to increase session time

Hi, I get automatically logged out. It's very irritating when I'm watching a stream and it suddenly stops; when I refresh the page I get the login page.

How do I increase the user session time?
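(For reference, this is controlled by the server.servlet.session.timeout value in config.yml shown earlier; for example, to allow 24-hour sessions:)

server:
  servlet:
    session:
      ## Session timeout in seconds (24 hours).
      timeout: 86400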

Consumer Switch to Stream Failed

Hello

I got the following stack trace when I try to switch to the stream view. The browser view works fine!
And if I try to log out from my default user, I get the Error 500 "Houston, we have a problem :)" page!

Thanks for your help

2019-01-15 16:46:08.816 ERROR 33528 --- [oundChannel-109] .WebSocketAnnotationMethodMessageHandler : Unhandled exception from message handler method

java.lang.NullPointerException: null
at org.sourcelab.kafka.webview.ui.controller.stream.StreamController.getLoggedInUser(StreamController.java:189) ~[classes!/:2.1.2]
at org.sourcelab.kafka.webview.ui.controller.stream.StreamController.getLoggedInUserId(StreamController.java:182) ~[classes!/:2.1.2]
at org.sourcelab.kafka.webview.ui.controller.stream.StreamController.newConsumer(StreamController.java:130) ~[classes!/:2.1.2]
at org.sourcelab.kafka.webview.ui.controller.stream.StreamController$$FastClassBySpringCGLIB$$9d63246f.invoke() ~[classes!/:2.1.2]
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204) ~[spring-core-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:746) ~[spring-aop-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:294) ~[spring-tx-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:98) ~[spring-tx-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185) ~[spring-aop-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:688) ~[spring-aop-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.sourcelab.kafka.webview.ui.controller.stream.StreamController$$EnhancerBySpringCGLIB$$65d5c314.newConsumer() ~[classes!/:2.1.2]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_191]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_191]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_191]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_191]
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:181) ~[spring-messaging-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:114) ~[spring-messaging-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.messaging.handler.invocation.AbstractMethodMessageHandler.handleMatch(AbstractMethodMessageHandler.java:517) [spring-messaging-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.messaging.simp.annotation.support.SimpAnnotationMethodMessageHandler.handleMatch(SimpAnnotationMethodMessageHandler.java:495) [spring-messaging-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.messaging.simp.annotation.support.SimpAnnotationMethodMessageHandler.handleMatch(SimpAnnotationMethodMessageHandler.java:88) [spring-messaging-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.messaging.handler.invocation.AbstractMethodMessageHandler.handleMessageInternal(AbstractMethodMessageHandler.java:475) [spring-messaging-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.messaging.handler.invocation.AbstractMethodMessageHandler.handleMessage(AbstractMethodMessageHandler.java:411) [spring-messaging-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at org.springframework.messaging.support.ExecutorSubscribableChannel$SendTask.run(ExecutorSubscribableChannel.java:138) [spring-messaging-5.0.12.RELEASE.jar!/:5.0.12.RELEASE]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_191]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_191]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_191]

Ability to Customize LDAP integration

Hey there,

Can you consider modifying LDAP configuration to provide more advanced custom LDAP search filters or options?

The issue is that the default Spring configuration finds groups under the provided base path with a type of groupOfUniqueNames and looks for uniqueMember attributes to match the user.

For example, we don't use the groupOfUniqueNames type for our groups, so the LDAP configuration won't work.

Can it be updated to provide an option for specifying the full LDAP search/filter path without looking up groups/users?

org.sourcelab.kafka.webview.ui.controller.api.exceptions.ApiException: Failed to construct kafka consumer

I have followed the instructions in issue #81 to create a new format, as per the (omitted) screenshot.
But when trying to view the topic, I get a "Failed to construct kafka consumer" message.
The view works to some extent when I use the String format instead of Avro, but obviously the messages are unreadable.

2018-12-20 23:17:22.264 INFO 9 --- [nio-8080-exec-5] o.a.kafka.common.utils.AppInfoParser : Kafka version : 1.1.1
2018-12-20 23:17:22.264 INFO 9 --- [nio-8080-exec-5] o.a.kafka.common.utils.AppInfoParser : Kafka commitId : 98b6346a977495f6
2018-12-20 23:17:22.399 WARN 9 --- [nio-8080-exec-9] .m.m.a.ExceptionHandlerExceptionResolver : Resolved [org.sourcelab.kafka.webview.ui.controller.api.exceptions.ApiException: Failed to construct kafka consumer]

Using KafkaAvroDeserializer

Hi,

I'm trying to declare a new Message Format by declaring the io.confluent.kafka.serializers.KafkaAvroDeserializer class and uploading kafka-avro-serializer-4.1.0.jar.
The declaration failed: the class is not found.
Do you have an idea? Do you know how to read topics serialized with Avro?

Regards

Antoine

Cluster and SSL config via file

Hi,

I'm using kafka-webview and I really like it so far. One big problem is that most of the config has to be done via the UI. Is there any way to pre-configure clusters and the SSL config in config.yml?

ERR_TOO_MANY_REDIRECTS chrome Error

When trying to use start.sh from the release bin package, the app is up and running, but Chrome shows the error ERR_TOO_MANY_REDIRECTS when requireSsl: true is set in the config file.

Add ability to set arbitrary kafka consumer properties when defining a cluster.

Problem

When defining a cluster and its connection properties, only a subset of common properties is exposed in the UI. More advanced options, such as disabling server host name verification, are definitely nice to have and could be added to the UI. But undoubtedly there are 100 other use cases that also require setting various configuration properties.

Currently there exists no way to do advanced configuration of connection properties without committing code to change the UI.

Possible Solution

Provide a general way to supply additional configuration properties when defining a cluster. Perhaps we add an "Advanced" tab that allows selecting known configuration options, as well as the option to define your own keys and values.

This should future-proof the application as well as support use cases currently unknown to the dev team. It also doesn't prevent us from adding a more user-friendly UI for more common settings.

Improve 'cluster explorer' pages

Let's improve the 'cluster explorer' pages in the app. I think we can do a better job visualizing a cluster, its topics, and its partition distribution.

Also add some screenshots to the README.

LDAP Integration does not use Bind

It looks like the current LDAP integration of kafka-webview wants to retrieve the user password attribute and compare it locally. This is a somewhat odd requirement, since most LDAP servers are set up to not return the password, especially not in a hashed and salted form that corresponds to the Spring LDAP encoder.

Instead, most projects make use of the LDAP bind functionality, which gives the server authority to make the authentication decision.

SASL Authentication

Greetings! Is it possible to use kafka-webview with a cluster that uses SASL authentication? With the simple console consumer I can do this by just passing a JAAS conf file (with login/password) as a JVM argument, like

export KAFKA_OPTS="-Djava.security.auth.login.config=/home/nkm/Apps/kafka_2.11-2.0.0/config/jaas_client.conf"

and a consumer.properties file with the lines

security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN

as command-line arguments, like:

bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test_topic --from-beginning --consumer.config config/consumer.properties

Crash listing topics when filter available

If I add a filter (I was using the example ones, unmodified) and then go to create a view, the application crashes after I have selected my cluster and no topics are available in the list. If I do not have any filters, it all works fine.

The stack trace is very long, but I did spot this; I don't know if it is relevant:

ERROR 23754 --- [nio-7070-exec-1] org.thymeleaf.TemplateEngine : [THYMELEAF][http-nio-7070-exec-1] Exception processing template "configuration/view/create": An error happened during template parsing (template: "class path resource [templates/configuration/view/create.html]")

org.thymeleaf.exceptions.TemplateInputException: An error happened during template parsing (template: "class path resource [templates/configuration/view/create.html]")

and later....

Caused by: org.attoparser.ParseException: Exception evaluating SpringEL expression: "filterParameters.containsKey(filter.id)" (template: "configuration/view/create" - line 352, col 45)

and later....

ERROR 23754 --- [nio-7070-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.thymeleaf.exceptions.TemplateInputException: An error happened during template parsing (template: "class path resource [templates/configuration/view/create.html]")] with root cause

I got the success message when setting up the filter. I used the StringSearchFilter, specified where to find the jar file (I used the 'with-dependencies' one), and set the classpath to examples.filter.StringSearchFilter.

auto set offset of view to tail to avoid burrow alert

When using Burrow, it will alert because the webview-managed topic offset is not updated automatically.

Can you please add a scheduler to consume the topic to tail?

BTW, webview does not use a group.id, so I cannot operate on it in management tools. Can you please add an option to allow an admin to set the group.id?

Thanks!
Jiming

search for topic

Hi, your app is really cool. What I miss is the possibility to search for and remove topics. When testing we deal with many topics, and this would make life much easier. Thanks!

Issue with Docker version of Kafka Webview and MacOS

Hello,

I noticed that the docker version of Kafka Webview cannot reach my local Kafka cluster on my macOS machine. Is this a known issue? It might have to do with the /etc/hosts file, but I am not sure.

The issue does not occur with the zipped version.

P.S. Kudos for this nice software, by the way.

Unable to view consumer group

Hi,
I have configured kafka-webview to connect to my Kafka cluster. The issue I am seeing is that the consumer groups pertaining to the topic are not shown in the Cluster DEV Consumer Groups section on my Cluster Explorer page. However, the Kafka Tool desktop UI configured on my local machine shows all the consumer groups and messages nicely. Is there anything I am missing or that needs to be configured in the configuration yaml files?

Thanks
Sujit

Got 500 when creating new message format

Hi @Crim ,

Thanks for this awesome project!
I am trying to play around with this tool and had two custom deserializers ready for test purposes.
I am able to add one of them successfully, but the other one always gets a 500.
The two custom deserializers both implement the Kafka Deserializer interface, but with different logic inside.

The exception in the log is as below:
2018-01-25 15:10:01.375 ERROR 18792 --- [nio-8080-exec-9] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.thymeleaf.exceptions.TemplateInputException: Error resolving template "/configuration/messageFormat/create", template might not exist or might not be accessible by any of the configured Template Resolvers] with root cause

org.thymeleaf.exceptions.TemplateInputException: Error resolving template "/configuration/messageFormat/create", template might not exist or might not be accessible by any of the configured Template Resolvers
at org.thymeleaf.engine.TemplateManager.resolveTemplate(TemplateManager.java:870) ~[thymeleaf-3.0.7.RELEASE.jar!/:3.0.7.RELEASE]
at org.thymeleaf.engine.TemplateManager.parseAndProcess(TemplateManager.java:607) ~[thymeleaf-3.0.7.RELEASE.jar!/:3.0.7.RELEASE]
at org.thymeleaf.TemplateEngine.process(TemplateEngine.java:1098) ~[thymeleaf-3.0.7.RELEASE.jar!/:3.0.7.RELEASE]
at org.thymeleaf.TemplateEngine.process(TemplateEngine.java:1072) ~[thymeleaf-3.0.7.RELEASE.jar!/:3.0.7.RELEASE]
......

Any clue about this issue?

Question: Problems with custom deserializer, UnknownFieldSet

After creating a custom deserializer and uploading it into the app, I get the following when trying to view a View which uses that message format.

 Error Type definition error: [simple type, class com.google.protobuf.UnknownFieldSet$Parser]; nested exception is com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class com.google.protobuf.UnknownFieldSet$Parser and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS) (through reference chain: org.sourcelab.kafka.webview.ui.manager.kafka.dto.KafkaResults["results"]->java.util.Collections$UnmodifiableRandomAccessList[0]-

Is there any way to avoid this? #75 looks like it tried to mitigate this, but I'm still seeing it on master HEAD.

For SASL+SSL clusters do not require KEYSTORE or KEYSTORE PASSWORD

Created from #109 and #105

Issue
Currently, creating a connection to a SASL+SSL cluster requires providing a KEYSTORE and KEYSTORE PASSWORD. These are not required for connecting to a SASL+SSL cluster; only the TRUSTSTORE and TRUSTSTORE PASSWORD are.

Suggested Fix
No longer require these fields when creating or updating a cluster connection in this scenario.
