
IAB Spiders And Robots Java Client

This is a Java 8 client library for the IAB/ABC International Spiders and Bots List (available separately).

The library is available from Maven Central, and is published under the Apache 2.0 License.

It uses Gradle as its build tool and contains a comprehensive set of JUnit tests.

Installation

Add the following dependency to your project's pom.xml:

<dependency>
    <groupId>com.snowplowanalytics</groupId>
    <artifactId>iab-spiders-and-robots-client</artifactId>
    <version>0.2.0</version>
</dependency>
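If you build with Gradle rather than Maven, the equivalent declaration uses the same coordinates; a sketch (configuration name depends on your Gradle version):

```groovy
dependencies {
    implementation 'com.snowplowanalytics:iab-spiders-and-robots-client:0.2.0'
}
```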

A Simple Example

Assume we have an HTTP request from the IP address 128.101.101.101 with the user agent string Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:50.0) Gecko/20100101 Firefox/50.0. To perform a robot or spider check on this request:

import java.io.File;
import java.net.InetAddress;

import com.snowplowanalytics.iab.spidersandrobotsclient.IabClient;
import com.snowplowanalytics.iab.spidersandrobotsclient.IabResponse;

// A File object pointing to your ip_exclude_current_cidr.txt file
File ipFile = new File("/path/to/ip_exclude_current_cidr.txt");

// File objects pointing to your user agent exclude and include lists
File excludeUaFile = new File("/path/to/exclude_current.txt");
File includeUaFile = new File("/path/to/include_current.txt");

// Create the IabClient object once and reuse it across lookups
IabClient client = new IabClient(ipFile, excludeUaFile, includeUaFile);

String useragent = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:50.0) Gecko/20100101 Firefox/50.0";
InetAddress ipAddress = InetAddress.getByName("128.101.101.101");
IabResponse iabResponse = client.check(useragent, ipAddress);
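Conceptually, a lookup like the one above combines two checks: is the IP inside any excluded CIDR block, and does the user agent contain an excluded token. The sketch below is an illustration of those two checks using only the standard library; the class and method names are hypothetical and this is not the library's actual implementation.

```java
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.List;

// Illustrative sketch of the two checks behind a spiders-and-robots lookup.
// All names here are hypothetical, not the library's API.
public class RobotCheckSketch {

    // True if the IPv4 address falls inside the given CIDR block, e.g. "128.101.0.0/16".
    static boolean inCidr(String ip, String cidr) {
        String[] parts = cidr.split("/");
        int prefix = Integer.parseInt(parts[1]);
        int ipBits = toInt(addressBytes(ip));
        int netBits = toInt(addressBytes(parts[0]));
        int mask = prefix == 0 ? 0 : -1 << (32 - prefix);
        return (ipBits & mask) == (netBits & mask);
    }

    static byte[] addressBytes(String ip) {
        try {
            return InetAddress.getByName(ip).getAddress();
        } catch (UnknownHostException e) {
            throw new IllegalArgumentException("Not a valid IP literal: " + ip, e);
        }
    }

    // Pack four address bytes into an int for masked comparison
    static int toInt(byte[] b) {
        return ((b[0] & 0xff) << 24) | ((b[1] & 0xff) << 16) | ((b[2] & 0xff) << 8) | (b[3] & 0xff);
    }

    // True if the user agent contains any excluded token (case-insensitive)
    static boolean uaExcluded(String ua, List<String> excludeTokens) {
        String lower = ua.toLowerCase();
        for (String token : excludeTokens) {
            if (lower.contains(token.toLowerCase())) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(inCidr("128.101.101.101", "128.101.0.0/16")); // true
        System.out.println(uaExcluded("Googlebot/2.1", List.of("bot", "crawler"))); // true
    }
}
```

The real client layers more on top of this (an include list that can override excludes, category and impact metadata in IabResponse), but the core matching is of this shape.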

For more complex examples and a step-by-step description, please refer to the library wiki: Usage Of The Library

Quickstart

Assuming git, Vagrant and VirtualBox are installed:

host$ git clone https://github.com/snowplow/iab-spiders-and-robots-java-client.git
host$ cd iab-spiders-and-robots-java-client
host$ vagrant up && vagrant ssh
guest$ cd /vagrant
guest$ ./gradlew clean build
guest$ ./gradlew test

Find out more

Copyright and License

IAB Spiders And Robots Java Client is copyright 2017-2020 Snowplow Analytics Ltd.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this software except in compliance with the License.

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Contributors

chuwy, oguzhanunlu, v-ladynev


iab-spiders-and-robots-java-client's Issues

Fix deprecated Spring repository link

Build fails on master branch

$ ./gradlew clean build

FAILURE: Build failed with an exception.

* What went wrong:
A problem occurred configuring root project 'iab-spiders-and-robots-java-client'.
> Could not resolve all dependencies for configuration ':classpath'.
   > Could not resolve org.springframework.build.gradle:propdeps-plugin:0.0.7.
     Required by:
         project :
      > Could not resolve org.springframework.build.gradle:propdeps-plugin:0.0.7.
         > Could not get resource 'http://repo.spring.io/plugins-release/org/springframework/build/gradle/propdeps-plugin/0.0.7/propdeps-plugin-0.0.7.pom'.
            > Could not GET 'http://repo.spring.io/plugins-release/org/springframework/build/gradle/propdeps-plugin/0.0.7/propdeps-plugin-0.0.7.pom'. Received status code 403 from server: Forbidden

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

BUILD FAILED

Total time: 11.851 secs

We need to use https instead of http.
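Assuming the plugin is wired in through a buildscript block in build.gradle (the exact location depends on the project's build files), the fix amounts to switching the repository URL's scheme:

```groovy
buildscript {
    repositories {
        // was: http://repo.spring.io/plugins-release — the server now returns 403 over plain HTTP
        maven { url 'https://repo.spring.io/plugins-release' }
    }
}
```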

IPv6 Support

Using the Snowplow IAB enrichment built on this library, we're occasionally seeing bad rows from IPv6 traffic.

Unexpected error processing events: java.lang.IllegalArgumentException: Could not parse [2001:8003:619c:ef00:a0ee:d4e4:2972:58d6]
	at org.apache.commons.net.util.SubnetUtils.toInteger(SubnetUtils.java:201)
	at org.apache.commons.net.util.SubnetUtils.access$400(SubnetUtils.java:28)
	at org.apache.commons.net.util.SubnetUtils$SubnetInfo.isInRange(SubnetUtils.java:109)
	at com.snowplowanalytics.iab.spidersandrobotsclient.lib.internal.IpRanges.belong(IpRanges.java:78)
	at com.snowplowanalytics.iab.spidersandrobotsclient.IabClient.checkAt(IabClient.java:80)
	at com.snowplowanalytics.snowplow.enrich.common.enrichments.registry.IabEnrichment.performCheck(IabEnrichment.scala:198)
	at com.snowplowanalytics.snowplow.enrich.common.enrichments.registry.IabEnrichment.getIab(IabEnrichment.scala:233)
	at com.snowplowanalytics.snowplow.enrich.common.enrichments.registry.IabEnrichment.getIabContext(IabEnrichment.scala:218)
	at com.snowplowanalytics.snowplow.enrich.common.enrichments.EnrichmentManager$.enrichEvent(EnrichmentManager.scala:286)
	at com.snowplowanalytics.snowplow.enrich.common.EtlPipeline$$anonfun$1$$anonfun$apply$1$$anonfun$apply$2$$anonfun$apply$3.apply(EtlPipeline.scala:92)
	at com.snowplowanalytics.snowplow.enrich.common.EtlPipeline$$anonfun$1$$anonfun$apply$1$$anonfun$apply$2$$anonfun$apply$3.apply(EtlPipeline.scala:91)
	at scalaz.NonEmptyList$class.map(NonEmptyList.scala:23)
	at scalaz.NonEmptyListFunctions$$anon$4.map(NonEmptyList.scala:207)
	at com.snowplowanalytics.snowplow.enrich.common.EtlPipeline$$anonfun$1$$anonfun$apply$1$$anonfun$apply$2.apply(EtlPipeline.scala:91)
	at com.snowplowanalytics.snowplow.enrich.common.EtlPipeline$$anonfun$1$$anonfun$apply$1$$anonfun$apply$2.apply(EtlPipeline.scala:88)
	at scalaz.Validation$class.map(Validation.scala:112)
	at scalaz.Success.map(Validation.scala:345)
	at com.snowplowanalytics.snowplow.enrich.common.EtlPipeline$$anonfun$1$$anonfun$apply$1.apply(EtlPipeline.scala:88)
	at com.snowplowanalytics.snowplow.enrich.common.EtlPipeline$$anonfun$1$$anonfun$apply$1.apply(EtlPipeline.scala:85)
	at scala.Option.map(Option.scala:146)
	at com.snowplowanalytics.snowplow.enrich.common.EtlPipeline$$anonfun$1.apply(EtlPipeline.scala:85)
	at com.snowplowanalytics.snowplow.enrich.common.EtlPipeline$$anonfun$1.apply(EtlPipeline.scala:82)
	at scalaz.Validation$class.map(Validation.scala:112)
	at scalaz.Success.map(Validation.scala:345)
	at com.snowplowanalytics.snowplow.enrich.common.EtlPipeline$.processEvents(EtlPipeline.scala:82)
	at com.snowplowanalytics.snowplow.enrich.spark.EnrichJob$.enrich(EnrichJob.scala:125)
	at com.snowplowanalytics.snowplow.enrich.spark.EnrichJob$$anonfun$5.apply(EnrichJob.scala:194)
	at com.snowplowanalytics.snowplow.enrich.spark.EnrichJob$$anonfun$5.apply(EnrichJob.scala:194)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
	at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:216)
	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1038)
	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1029)
	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:969)
	at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1029)
	at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:760)
	at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
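The failure originates in Apache Commons Net's SubnetUtils, which only parses IPv4 dotted-quad addresses and CIDR blocks, so an IPv6 literal cannot be converted to a 32-bit integer. As a sketch of one possible direction (not a proposed patch), the standard library can represent addresses of either family uniformly by widening the raw bytes to a BigInteger:

```java
import java.math.BigInteger;
import java.net.InetAddress;
import java.net.UnknownHostException;

// Illustrative only: widen an IP literal (IPv4 or IPv6) to an unsigned BigInteger,
// which makes range comparisons possible for both address families.
public class IpToInteger {
    static BigInteger toBigInteger(String ip) {
        try {
            // 4 bytes for IPv4, 16 bytes for IPv6; no DNS lookup occurs for literals
            byte[] bytes = InetAddress.getByName(ip).getAddress();
            return new BigInteger(1, bytes); // unsigned interpretation
        } catch (UnknownHostException e) {
            throw new IllegalArgumentException("Not a valid IP literal: " + ip, e);
        }
    }

    public static void main(String[] args) {
        System.out.println(toBigInteger("128.101.101.101")); // 2154128741
        System.out.println(toBigInteger("2001:8003:619c:ef00:a0ee:d4e4:2972:58d6"));
    }
}
```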

Prepare for release

  • Finalize release date in CHANGELOG - today, 3 April
  • Finalize version to 0.1.0
  • Squash git history back to "Initial commit"
