
aliyun-maxcompute-data-collectors's Introduction

Aliyun MaxCompute Data Collectors



This project is a group of big data plugins for exchanging data with Aliyun MaxCompute. The plugins include flume-plugin, kettle-plugin, ogg-plugin, and odps-sqoop.

Requirements

  • JDK 1.6 or later
  • Apache Maven 3.x

Building the Sources

Clone the project from GitHub:

$ git clone https://github.com/aliyun/aliyun-maxcompute-data-collectors.git

Build the sources using Maven:

$ cd aliyun-maxcompute-data-collectors
$ mvn clean package -DskipTests=true -Dmaven.javadoc.skip=true

Plugin packages are under each plugin subproject's target directory.
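
To build a single plugin together with its in-repo dependencies, Maven's standard -pl/-am flags can be used (a sketch; it assumes the module directory matches the plugin name, e.g. odps-sqoop):

$ cd aliyun-maxcompute-data-collectors
$ mvn clean package -pl odps-sqoop -am -DskipTests=true -Dmaven.javadoc.skip=true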

Usage

Please refer to the Wiki for basic usage.

License

Licensed under the Apache License 2.0

aliyun-maxcompute-data-collectors's People

Contributors

cornmonster, dingxin-tech, idleyui, linus-lc, lulualibaba, lyman, oyz, pandacanfly, shujiewu


aliyun-maxcompute-data-collectors's Issues

java.lang.IllegalArgumentException: No factory for connector odps

2022-09-23T10:25:26.260+0800 INFO main com.facebook.presto.server.PluginManager -- Finished loading plugin /data/presto/data/plugin/teradata-functions --
2022-09-23T10:25:26.260+0800 INFO main com.facebook.presto.server.PluginManager -- Loading plugin /data/presto/data/plugin/tpcds --
2022-09-23T10:25:26.267+0800 INFO main com.facebook.presto.server.PluginManager Installing com.facebook.presto.tpcds.TpcdsPlugin
2022-09-23T10:25:26.273+0800 INFO main com.facebook.presto.server.PluginManager Registering connector tpcds
2022-09-23T10:25:26.274+0800 INFO main com.facebook.presto.server.PluginManager -- Finished loading plugin /data/presto/data/plugin/tpcds --
2022-09-23T10:25:26.274+0800 INFO main com.facebook.presto.server.PluginManager -- Loading plugin /data/presto/data/plugin/tpch --
2022-09-23T10:25:26.277+0800 INFO main com.facebook.presto.server.PluginManager Installing com.facebook.presto.tpch.TpchPlugin
2022-09-23T10:25:26.282+0800 INFO main com.facebook.presto.server.PluginManager Registering connector tpch
2022-09-23T10:25:26.283+0800 INFO main com.facebook.presto.server.PluginManager -- Finished loading plugin /data/presto/data/plugin/tpch --
2022-09-23T10:25:26.283+0800 INFO main com.facebook.presto.server.PluginManager -- Loading plugin /data/presto/data/plugin/ttl-fetchers --
2022-09-23T10:25:26.289+0800 INFO main com.facebook.presto.server.PluginManager Installing com.facebook.presto.nodettlfetchers.PrestoNodeTtlFetcherPlugin
2022-09-23T10:25:26.294+0800 INFO main com.facebook.presto.server.PluginManager Registering Ttl fetcher factory infinite
2022-09-23T10:25:26.294+0800 INFO main com.facebook.presto.server.PluginManager -- Finished loading plugin /data/presto/data/plugin/ttl-fetchers --
2022-09-23T10:25:26.305+0800 INFO main com.facebook.presto.metadata.StaticCatalogStore -- Loading catalog properties etc/catalog/odps.properties --
2022-09-23T10:25:26.307+0800 INFO main com.facebook.presto.metadata.StaticCatalogStore -- Loading catalog odps --
2022-09-23T10:25:26.307+0800 ERROR main com.facebook.presto.server.PrestoServer No factory for connector odps
java.lang.IllegalArgumentException: No factory for connector odps
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
at com.facebook.presto.connector.ConnectorManager.createConnection(ConnectorManager.java:212)
at com.facebook.presto.metadata.StaticCatalogStore.loadCatalog(StaticCatalogStore.java:123)
at com.facebook.presto.metadata.StaticCatalogStore.loadCatalog(StaticCatalogStore.java:98)
at com.facebook.presto.metadata.StaticCatalogStore.loadCatalogs(StaticCatalogStore.java:80)
at com.facebook.presto.metadata.StaticCatalogStore.loadCatalogs(StaticCatalogStore.java:68)
at com.facebook.presto.server.PrestoServer.run(PrestoServer.java:150)
at com.facebook.presto.server.PrestoServer.main(PrestoServer.java:85)
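
"No factory for connector odps" means the ODPS plugin was never registered: PluginManager logs every plugin directory it loads, and odps is missing from the list above. A hedged deployment sketch, assuming the plugin directory layout shown in the log (the archive name is a placeholder for whatever the presto-connector build produces):

# Unpack the built connector into its own directory under the server's plugin/
# directory, then restart Presto so PluginManager registers the odps connector.
$ mkdir -p /data/presto/data/plugin/odps
$ unzip presto-connector-*.zip -d /data/presto/data/plugin/odps
# After a restart the log should show: -- Loading plugin /data/presto/data/plugin/odps --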

Hoping the trino-connector can support Trino 422

The trino-connector currently only supports up to Trino 364, while we run Trino 422 in production; the former targets JDK 11 and the latter JDK 17.
We want to use Trino for federated queries across data sources, both on the cloud and off the cloud.

Sqoop doesn't export all data in a table

We are facing a problem:
We have a table in ODPS with 11 records, but when we use Sqoop to export the data, it only exports 8 records.
Interestingly, it always misses the same 3 records.

Sqoop could not parse record when exporting data from MaxCompute to PostgreSQL

Hi Ali,
I am using Sqoop to export data from MaxCompute to Postgres.

./odps-sqoop/bin/sqoop export --connect jdbc:postgresql://localhost:5432/replication_db --table dim_wmp_cabinet \
    --username replication_user --password replication_pass \
    --odps-table dim_wmp_cabinet --odps-project xxx --odps-accessid xxx \
    --odps-tunnel-endpoint http://xxxx \
    --odps-partition-spec ds=20170916 \
    --odps-accesskey xxxx --odps-endpoint http://sxxx/api

I am looking into this code in OdpsExportMapper.java:

try {
      odpsImpl.parse(val);
      context.write(odpsImpl, NullWritable.get());
} catch (Exception e) {
      // error handling elided from this excerpt
}

I succeeded in adding the tunnel endpoint to the source code, but couldn't get past this one.

Please help fix this issue.

17/09/18 18:16:46 INFO mapreduce.Job: The url to track the job: http://localhost:8080/
17/09/18 18:16:46 INFO mapreduce.Job: Running job: job_local873411290_0001
17/09/18 18:16:46 INFO mapred.LocalJobRunner: OutputCommitter set in config null
17/09/18 18:16:46 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.sqoop.mapreduce.NullOutputCommitter
17/09/18 18:16:46 INFO mapred.LocalJobRunner: Waiting for map tasks
17/09/18 18:16:46 INFO mapred.LocalJobRunner: Starting task: attempt_local873411290_0001_m_000000_0
17/09/18 18:16:46 INFO util.ProcfsBasedProcessTree: ProcfsBasedProcessTree currently is supported only on Linux.
17/09/18 18:16:46 INFO mapred.Task:  Using ResourceCalculatorProcessTree : null
17/09/18 18:16:46 INFO mapred.MapTask: Processing split: org.apache.sqoop.mapreduce.odps.OdpsExportInputFormat$OdpsExportInputSplit@6a6d595f
17/09/18 18:16:46 ERROR odps.OdpsExportMapper: Exception raised during data export
17/09/18 18:16:46 ERROR odps.OdpsExportMapper: Exception:
java.lang.RuntimeException: Can't parse input data: '3'
	at dim_wmp_cabinet.__loadFromFields(dim_wmp_cabinet.java:2090)
	at dim_wmp_cabinet.parse(dim_wmp_cabinet.java:1533)
	at org.apache.sqoop.mapreduce.odps.OdpsExportMapper.map(OdpsExportMapper.java:77)
	at org.apache.sqoop.mapreduce.odps.OdpsExportMapper.map(OdpsExportMapper.java:35)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
	at dim_wmp_cabinet.__loadFromFields(dim_wmp_cabinet.java:2010)
	... 13 more
17/09/18 18:16:46 ERROR odps.OdpsExportMapper: On input: com.aliyun.odps.data.ArrayRecord@4c665e0d
17/09/18 18:16:46 ERROR odps.OdpsExportMapper: At position 0
17/09/18 18:16:46 ERROR odps.OdpsExportMapper:
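
The NullPointerException inside __loadFromFields typically means a NULL column value that the generated record class cannot handle. A hedged workaround sketch using Sqoop's stock null-handling export options (--input-null-string / --input-null-non-string are standard Sqoop export arguments; whether the ODPS export path honors them is an assumption):

./odps-sqoop/bin/sqoop export --connect jdbc:postgresql://localhost:5432/replication_db --table dim_wmp_cabinet \
    --username replication_user --password replication_pass \
    --input-null-string '\\N' --input-null-non-string '\\N' \
    --odps-table dim_wmp_cabinet --odps-project xxx --odps-accessid xxx \
    --odps-tunnel-endpoint http://xxxx \
    --odps-partition-spec ds=20170916 \
    --odps-accesskey xxxx --odps-endpoint http://sxxx/api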

trino-connector: querying an ODPS table throws an error


Table schema:
id: string
user_name: string
Data content: 1, "xx"

Query 20230301_042651_00012_j7ivw is FAILED
2023-03-01T12:27:27.607+0800 DEBUG query-execution-23 io.trino.execution.QueryStateMachine Query 20230301_042651_00012_j7ivw failed
java.lang.NullPointerException
at com.aliyun.odps.cupid.trino.OdpsRecordCursor.getValueInternal(OdpsRecordCursor.java:65)
at com.aliyun.odps.cupid.trino.OdpsRecordCursor.isNull(OdpsRecordCursor.java:170)
at io.trino.spi.connector.RecordPageSource.getNextPage(RecordPageSource.java:96)
at io.trino.operator.TableScanOperator.getOutput(TableScanOperator.java:311)
at io.trino.operator.Driver.processInternal(Driver.java:388)
at io.trino.operator.Driver.lambda$processFor$9(Driver.java:292)
at io.trino.operator.Driver.tryWithLock(Driver.java:685)
at io.trino.operator.Driver.processFor(Driver.java:285)
at io.trino.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:1078)
at io.trino.execution.executor.PrioritizedSplitRunner.process(PrioritizedSplitRunner.java:163)
at io.trino.execution.executor.TaskExecutor$TaskRunner.run(TaskExecutor.java:484)
at io.trino.$gen.Trino_364____20230301_040204_2.run(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)

Support http_proxy for sqoop

I am using Alicloud, which is configured to run behind an http_proxy.
Without the proxy I cannot reach MaxCompute.
I tried setting the env vars http_proxy and https_proxy, but it doesn't work.
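
The JVM ignores the http_proxy/https_proxy environment variables and instead reads the standard http.proxyHost/http.proxyPort system properties. A hedged sketch that passes them to the Sqoop client JVM through HADOOP_OPTS (proxy.example.com:8080 is a placeholder; whether the ODPS SDK's HTTP client honors these JVM proxy properties is an assumption):

$ export HADOOP_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080"
$ ./odps-sqoop/bin/sqoop export ...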

Aliyun Flume plugin configuration questions

Questions about the Aliyun Flume plugin:

a1.sources.r1.interceptors.i1.type = regex_extractor
a1.sources.r1.interceptors.i1.regex = ^(?:\n)?(\d\d\d\d-\d\d-\d\d\s\d\d:\d\d)
What do these two settings mean?

a1.sinks.k1.odps.endPoint =
a1.sinks.k1.odps.datahub.endPoint =
What do these two settings mean? If data is written directly to ODPS, what is the purpose of odps.datahub?

a1.sinks.k1.serializer.delimiter = ,
a1.sinks.k1.serializer.fieldnames = ,col1,col2,col3
Splitting by a delimiter does not meet my needs; is there a way to put an entire log line into a single field? (See the sketch below.)
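
A hedged sketch of one way to keep an entire log line in a single field: declare a single field name and a delimiter character that never appears in the data, so no splitting effectively occurs (whether the plugin's delimited serializer accepts a single-column layout is an assumption; the a1/k1 names follow the snippets above):

# Single-column layout: the whole event body lands in col1
a1.sinks.k1.serializer.delimiter = \u0001
a1.sinks.k1.serializer.fieldnames = col1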

[Spark-connector] JSON data type not supported

As the title says.
============= In MaxCompute ===============
CREATE TABLE json_table
(
json_val json
);

CREATE TABLE string_table
(
string_val STRING
);

INSERT INTO string_table VALUES
('{"a":1, "b":2}')
,('{"a":"key", "b":2}')
,('{"c":3}');

INSERT INTO json_table
SELECT json_parse(string_val)
FROM string_table;

============ In spark ================

SELECT * FROM json_table fails with the error "json not supported".
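
A hedged workaround sketch while the connector lacks JSON support: keep the payload as STRING on the MaxCompute side (the string_table above) and parse it in Spark with the built-in get_json_object function:

-- In Spark SQL: extract fields from the raw string column instead of the JSON type
SELECT get_json_object(string_val, '$.a') AS a,
       get_json_object(string_val, '$.b') AS b
FROM string_table;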

Presto 333: every query with a filter condition times out

1. The Presto client reports a timeout, and the server log shows a timeout as well; raising the optimizer timeout does not help (Query 20230828_082025_00005_ztxc8 failed: The optimizer exhausted the time limit of 180000 ms).
2. The jstack output also contains errors.
PS: error screenshots were attached to the original issue.

Creating a MySQL-to-ODPS transformation in Kettle throws an exception when the ODPS table has partitions

When creating a MySQL-to-ODPS transformation in Kettle, the following exception is thrown if the ODPS table has partitions:

2017/08/02 09:16:15 - test-addition - Dispatching started for transformation [test-addition]
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Failed to drop the MaxCompute destination table partition. Error occurred in project: null, table: kettle_test, partition: columnc=1.
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : java.lang.RuntimeException: Failed to drop the MaxCompute destination table partition. Error occurred in project: null, table: kettle_test, partition: columnc=1.
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.OdpsUtil.dropPart(OdpsUtil.java:120)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.OdpsUtil.truncatePartition(OdpsUtil.java:103)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.OdpsUtil.dealTruncate(OdpsUtil.java:48)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.pentaho.di.trans.steps.odpsoutput.OdpsOutput.init(OdpsOutput.java:223)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at org.pentaho.di.trans.step.StepInitThread.run(StepInitThread.java:69)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at java.lang.Thread.run(Thread.java:745)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - Caused by: com.aliyun.odps.NoSuchObjectException: ODPS-0130161:Parse exception - line 1:12 mismatched input 'null' expecting Identifier near 'TABLE' in table name
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.rest.RestClient.handleErrorResponse(RestClient.java:369)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.rest.RestClient.request(RestClient.java:301)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.rest.RestClient.request(RestClient.java:251)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.rest.RestClient.stringRequest(RestClient.java:227)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.Instances.create(Instances.java:337)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.Instances.create(Instances.java:272)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.Instances.create(Instances.java:222)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.Instances.create(Instances.java:117)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.Table.runSQL(Table.java:1082)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.Table.deletePartition(Table.java:920)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - at com.aliyun.odps.OdpsUtil.dropPart(OdpsUtil.java:118)
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - ... 5 more
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - Caused by: com.aliyun.odps.rest.RestException: RequestId=598127E06B60FA292B0DA7AC,Code=NoSuchObject,Message=ODPS-0130161:Parse exception - line 1:12 mismatched input 'null' expecting Identifier near 'TABLE' in table name
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - ... 16 more
2017/08/02 09:16:16 - Aliyun MaxCompute Output.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Error initializing step [Aliyun MaxCompute Output]
2017/08/02 09:16:16 - test-addition - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Step [Aliyun MaxCompute Output.0] failed to initialize!
2017/08/02 09:16:16 - input.0 - Finished reading query, closing connection.
Unable to prepare and initialize this transformation
The exception shows that the project name obtained through the Partition is null, even though it is set in the configuration.
If I follow the source code's logic and hand-write code to create the Odps object, get the Table, and fetch or modify the Partition, everything works fine; I don't know why it fails once packaged as a plugin and invoked through Kettle. (A sketch of the hand-written path follows below.)
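
For reference, a minimal sketch of the hand-written SDK path described above, using the public ODPS Java SDK (credentials, endpoint, and project are placeholders; this mirrors what OdpsUtil.dropPart attempts, but with the project name set explicitly):

import com.aliyun.odps.Odps;
import com.aliyun.odps.OdpsException;
import com.aliyun.odps.PartitionSpec;
import com.aliyun.odps.Table;
import com.aliyun.odps.account.AliyunAccount;

public class DropPartitionManually {
    public static void main(String[] args) throws OdpsException {
        Odps odps = new Odps(new AliyunAccount("<accessId>", "<accessKey>"));
        odps.setEndpoint("http://service.cn-hangzhou.maxcompute.aliyun.com/api");
        odps.setDefaultProject("<projectName>"); // the value that arrives as null inside the plugin

        Table table = odps.tables().get("kettle_test");
        // Drop the partition if it exists, i.e. the step that fails in OdpsUtil.truncatePartition
        table.deletePartition(new PartitionSpec("columnc=1"), true);
    }
}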

Adding configuration for <tunnel_endpoint>xxx</tunnel_endpoint>

Objective: to be able to configure the tunnel_endpoint in the PDI plugin.
Issue: our connection to service.ap-southeast-1.maxcompute.aliyun.com does not work unless we also specify the tunnel endpoint, but no such configuration is available in the PDI plugin.
Please help. Thanks in advance :)

Building the aliyun-odps-kettle-plugin module fails

Reactor Summary:
[INFO]
[INFO] Open Data Processing Service ....................... SUCCESS [ 1.821 s]
[INFO] maxcompute.data.collectors.common .................. SUCCESS [ 13.113 s]
[INFO] odps-flume-plugin .................................. SUCCESS [ 18.087 s]
[INFO] aliyun-odps-kettle-plugin .......................... FAILURE [ 0.717 s]
[INFO] aliyun-datahub-ogg-plugin .......................... SKIPPED
[INFO] odps-sqoop ......................................... SKIPPED
[INFO] odps_data_dumper ................................... SKIPPED

[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] error reading /java/mvn/apache-maven-3.2.5/repo/swt/swt-win32/3.0m8/swt-win32-3.0m8.jar; error in opening zip file
[ERROR] error reading /java/mvn/apache-maven-3.2.5/repo/swt/swt-win32/3.0m8/swt-win32-3.0m8.jar; error in opening zip file
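
"error in opening zip file" usually indicates a corrupted artifact in the local Maven repository. A hedged fix sketch: delete the broken jar so Maven re-downloads it on the next build (the repository path is taken from the error above):

$ rm -rf /java/mvn/apache-maven-3.2.5/repo/swt/swt-win32/3.0m8
$ mvn clean package -DskipTests=true -Dmaven.javadoc.skip=true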

[presto-connector] Configuration property 'failure-detector.enabled' was not used

io.airlift.bootstrap.ApplicationConfigurationException: Configuration errors:

  1. Error: Configuration property 'failure-detector.enabled' was not used

1 error
at io.airlift.bootstrap.Bootstrap.initialize(Bootstrap.java:239)
at io.prestosql.server.testing.TestingPrestoServer.<init>(TestingPrestoServer.java:297)
at io.prestosql.tests.DistributedQueryRunner.createTestingPrestoServer(DistributedQueryRunner.java:209)
at io.prestosql.tests.DistributedQueryRunner.<init>(DistributedQueryRunner.java:139)
at io.prestosql.tests.DistributedQueryRunner$Builder.build(DistributedQueryRunner.java:593)

I am porting this presto-connector code to openLooKeng 1.4 (developing an ODPS connector), and the error above occurs while running the tests. Could you advise on the correct way to write the test class?

public static void installConnector(QueryRunner runner) {
    // Catalog properties handed to the ODPS connector factory
    Map<String, String> properties = new ImmutableMap.Builder<String, String>()
            .put("odps.project.name", "xxxx")
            .put("odps.access.id", "xxxx")
            .put("odps.access.key", "xxxx")
            .put("odps.end.point", "http://service.cn-hangzhou.maxcompute.aliyun.com/api")
            .put("odps.tunnel.end.point", "http://dt.cn-hangzhou.maxcompute.aliyun.com")
            .put("odps.input.split.size", "64")
            //.put("failure-detector.enabled", "true") // server-level property, not a connector property
            .build();
    runner.installPlugin(new OdpsPlugin());           // register the plugin with the test runner
    runner.createCatalog("odps", "odps", properties); // catalog name, connector name, properties
}

Question about the ODPS Kettle plugin

The input step of the ODPS Kettle plugin (full name: aliyun-kettle-odps-plugin). [Screenshot attached in the original issue.]

The input data does not include the values of the partition columns; could someone explain why?

presto-connector module with "No factory for connector odps"

ERROR main com.facebook.presto.server.PrestoServer No factory for connector odps
java.lang.IllegalArgumentException: No factory for connector odps
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
at com.facebook.presto.connector.ConnectorManager.createConnection(ConnectorManager.java:194)
at com.facebook.presto.metadata.StaticCatalogStore.loadCatalog(StaticCatalogStore.java:96)
at com.facebook.presto.metadata.StaticCatalogStore.loadCatalogs(StaticCatalogStore.java:74)
at com.facebook.presto.server.PrestoServer.run(PrestoServer.java:132)
at com.facebook.presto.server.PrestoServer.main(PrestoServer.java:74)

Trino connector reports that the schema does not exist when accessing MaxCompute

$ bin/trino-cli --server localhost:19090 --catalog odps;
trino> show schemas;
Schema

hyq_prd
information_schema
(2 rows)

Query 20231129_025900_00012_dw5sy, FINISHED, 1 node
Splits: 19 total, 19 done (100.00%)
0.22 [2 rows, 35B] [9 rows/s, 163B/s]

trino> use hyq_prd;
Query 20231129_025904_00013_dw5sy failed: Schema does not exist: odps.hyq_prd

trino> use information_schema;
USE
trino:information_schema>

The contents of odps.properties are as follows:

connector.name=odps
odps.project.name=hyq_prd
odps.access.id=xxx
odps.access.key=xxx
odps.end.point=http://service.cn-shanghai.maxcompute.aliyun-inc.com/api
odps.input.split.size=64

The schema is listed by SHOW SCHEMAS, yet the USE statement reports that it does not exist.
One more note: the ODPS project has not enabled the three-tier model, i.e. we have not applied for the schema public preview.

Cannot build the source code

I'm trying to build the package from source, but get the following error when executing the Maven build:

[ERROR] Failed to execute goal on project aliyun-odps-kettle-plugin: Could not resolve dependencies for project com.aliyun.odps:aliyun-odps-kettle-plugin:jar:2.0.2: Could not find artifact com.aliyun:maxcompute.data.collectors.common:jar:2.0.2 in pentaho-releases (http://repository.pentaho.org/artifactory/repo/) -> [Help 1]
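
The pentaho-releases repository cannot serve the project's own maxcompute.data.collectors.common artifact; it only becomes resolvable after the reactor builds and installs it. A hedged sketch: run install from the repository root so sibling modules land in the local repository before aliyun-odps-kettle-plugin tries to resolve them:

$ cd aliyun-maxcompute-data-collectors
$ mvn clean install -DskipTests=true -Dmaven.javadoc.skip=true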
