jsk_aerial_robot's Introduction

Build Status

This repository is for aerial robots, especially transformable aerial robots such as the one shown in the following figure.

uav_intro

Setup

source /opt/ros/${ROS_DISTRO}/setup.bash # please replace ${ROS_DISTRO} with your specific env variable, e.g., melodic
mkdir -p ~/ros/jsk_aerial_robot_ws/src
cd ~/ros/jsk_aerial_robot_ws
sudo rosdep init
rosdep update
wstool init src
wstool set -u -t src jsk_aerial_robot https://github.com/jsk-ros-pkg/jsk_aerial_robot --git
wstool merge -t src src/jsk_aerial_robot/aerial_robot_${ROS_DISTRO}.rosinstall
wstool update -t src
rosdep install -y -r --from-paths src --ignore-src --rosdistro $ROS_DISTRO
catkin build

Demo

Please check the instructions in the wiki.

jsk_aerial_robot's People

Contributors

chibi314, github-actions[bot], greenpepper123, harukikozukapenguin, kanke-hayata, keitaito123, knorth55, li-jinjie, makit0sh, sugihara-16, sugikazu75, tongtybj


jsk_aerial_robot's Issues

dragon type

The kinematics change with both the desired coordinate and the joint states.
The best way is to update the kinematics on each of them individually, but the current code only updates on the joint states.
Right now this is OK, since the update rate of LQI is slower than that of the joint states and the desired coordinate
(LQI < joint states = desired coordinate), but it is a potential problem.
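The intended fix can be sketched as follows (a minimal illustration with hypothetical names, not the actual repository code): recompute the kinematics whenever either input changes, instead of only on joint-state callbacks.

```python
class KinematicsUpdater:
    """Hypothetical sketch: update the kinematics on both triggers,
    the desired coordinate and the joint states."""

    def __init__(self):
        self.desire_coord = None
        self.joint_states = None
        self.update_count = 0

    def set_desire_coordinate(self, coord):
        self.desire_coord = coord
        self._update_kinematics()  # also update on this trigger

    def set_joint_states(self, joints):
        self.joint_states = joints
        self._update_kinematics()

    def _update_kinematics(self):
        # placeholder for the real forward-kinematics computation
        self.update_count += 1


u = KinematicsUpdater()
u.set_joint_states([0.0, 0.5, 0.5])
u.set_desire_coordinate((0.0, 0.0, 0.1))
```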

Fix the hydrus model in gazebo

Parts such as the legs look wrong: paired parts seem to be displayed on the same side instead of at symmetric positions?
It does not look good, so we want to fix it at some point.

[Sensor Fusion] new estimation model

  1. Time-synchronizable estimation model
    Can give better results for measurement sensor values which have a delay (e.g., optical flow, GPS).
    Please check bd4da61

  2. XY Roll/Pitch Bias EKF model
    40455a adds a new EKF model which estimates [pos_x, vel_x, pos_y, vel_y, roll_bias, pitch_bias] at the same time. Combine with

[GPS] vel control

On 2017/5/12, the GPS-based flight control was tested.
Although the flight was stable, there was a bug: the flight used pos_world_based_control_mode,
so the position control relied on the low-accuracy position value estimated from GPS.
The velocity estimate from GPS is considered more reliable than the latitude/longitude data from GPS. This bug is fixed by commit 5fe1c71.
Therefore we have to check vel_world_based_control_mode again with the same indoor gain (0.4).

On the other hand, the position control demonstrated reasonably stable flight even in a strong-wind environment. This indicates the feasibility of GPS-based position control. Test the waypoint code.

Repairing the hydrus airframe

During experiments in the quad configuration, several protector rings and propellers were broken.
They need to be replaced.
Do this together with item 2 of #76.

How to make minor variations of a robot model

When we want a hydrus_hex model with a camera, we would create hydrus_hex_with_camera.urdf.xacro. However, simply passing type:=hydrus_hex_with_camera when loading it via aerial_robot_model.launch is not enough, because corresponding files such as $(arg type)_init.yaml also have to exist, so the model cannot be loaded.

Recreating every such file for this purpose is bad (it would produce many files with identical content).

Outdoor Experiment

  1. GPS:
    test: acc without bias estimation
    test form: o, s(non yaw)
    test control mode: vel (or pos)

  2. Optical Flow
    test: kalman filter
    test form: same
    test control mode: vel

  3. VO
    test: kalman filter
    test form: same
    test control mode: pos

Better system for the state estimation

  • The state variables store the 6-DOF quantities up to their second derivatives, described in both the baselink frame (where the FCU/IMU is attached) and the CoG frame. They are further divided into three modes: egomotion, experiment, and ground_truth. See basic_state_estimation.h for details.

  • Basically, state estimation is performed in the baselink frame, and the CoG-frame quantities are computed from that result and the kinematics. Flight control is performed in the CoG frame.

Tasks:

  1. Move this to aerial_robot_estimation.

  2. The current state estimator is ROS-based, so real-time performance is not guaranteed. In addition, since it is accessed from multiple processes, a mutex is used, but the exclusion handling is sloppy. Think of a better approach.

Two ways to calculate the integral term (offset) for altitude

  1. Integrate pos_err, then multiply by the gain.
  2. Integrate the intermediate result pos_err * gain.

We have been using the first way, which may cause a sudden fall during the aerial transformation; however, the second one may cause an x/y position offset during the transformation.
It is a trade-off.
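The two schemes can be sketched as below (illustrative Python, not the flight-control code). They coincide while the gain is constant, and diverge exactly when the gain changes, e.g. during transformation:

```python
def integral_way1(pos_errs, gains, dt):
    """Way 1: integrate pos_err, then multiply the whole sum by the current gain."""
    acc = 0.0
    for e in pos_errs:
        acc += e * dt
    return gains[-1] * acc


def integral_way2(pos_errs, gains, dt):
    """Way 2: integrate the intermediate result pos_err * gain at each step."""
    acc = 0.0
    for e, k in zip(pos_errs, gains):
        acc += k * e * dt
    return acc


errs = [1.0, 1.0, 1.0]
# constant gain: both ways agree
assert abs(integral_way1(errs, [2.0] * 3, 0.1) - integral_way2(errs, [2.0] * 3, 0.1)) < 1e-9
# gain drops mid-flight (e.g. during transformation): the results differ
print(round(integral_way1(errs, [2.0, 2.0, 1.0], 0.1), 3))  # 0.3 (old integral rescaled by the new gain)
print(round(integral_way2(errs, [2.0, 2.0, 1.0], 0.1), 3))  # 0.5 (history keeps the old gains)
```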

Upper-limit problem of the yaw D term

The yaw D term is computed on the d_board side,
and its upper limit is set quite arbitrarily.
https://github.com/tongtybj/d_board/pull/21

Currently it corresponds to 3 [N]. The sudden drop during transformation in gazebo happens because this D term hits its upper limit. As a first measure, slowing down the transformation avoids the problem.
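The saturation behaviour can be illustrated with a simple symmetric clamp (all values here are illustrative; the real limit lives in the d_board firmware):

```python
def clamp(x, limit):
    """Symmetric saturation, as applied to the yaw D term on the board."""
    return max(-limit, min(limit, x))


D_LIMIT = 3.0        # roughly the 3 [N]-equivalent limit
d_gain = 0.8         # illustrative D gain
yaw_rate_err = 5.0   # large rate error during a fast transformation

d_term = clamp(d_gain * yaw_rate_err, D_LIMIT)
print(d_term)  # 3.0 -> the D term is pinned at its upper limit
```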

[uav_odom] separate the root_link and cog odom

Right now, the reference point of the UAV odometry is ambiguous:

  • position: the origin of root_link, to which the IMU (FC) is attached
  • orientation: the CoG frame

#55 unified the odometry reference point to root_link.

We also have to add a publisher for the UAV CoG point.

Publish all links' information

@tongtybj
Currently, link2's global position, velocity, and orientation (/uav/state) and the joints' angles and angular velocities (/hydrus3/joint_states) are published.

It would be much more convenient if we could also publish every link's position, velocity, and orientation.

level approximation for the attitude control

As mentioned in this paper,
the angular velocity of the UAV in the {COG} frame is approximated as equal to the time derivative of the Euler angles. This is based on the assumption that the {COG} attitude stays close to level.
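Concretely, for ZYX (roll-pitch-yaw) Euler angles (phi, theta, psi), the body angular velocity and its near-level approximation are:

```latex
\omega =
\begin{bmatrix}
\dot{\phi} - \dot{\psi}\sin\theta \\
\dot{\theta}\cos\phi + \dot{\psi}\cos\theta\sin\phi \\
-\dot{\theta}\sin\phi + \dot{\psi}\cos\theta\cos\phi
\end{bmatrix}
\quad\xrightarrow{\;\phi,\,\theta\,\approx\,0\;}\quad
\begin{bmatrix} \dot{\phi} \\ \dot{\theta} \\ \dot{\psi} \end{bmatrix}
```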

Frame problem in state estimation

Currently local and global are completely separated, and this separation also affects the control.
Ideally, local and global should be estimated simultaneously.

Classical 4-DOF controller using the pseudo-inverse

This approach is used in ETH's omni-directional work:
perform diagonalization via the pseudo-inverse,
and compare it with optimal control.

Concern: tuning each feedback gain is too qualitative. When the inputs are converted into thrusts, they may hit the thrust limits. We will not know until we try.
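The pseudo-inverse idea can be sketched with numpy (the allocation matrix, geometry, and limits below are made up for illustration): map the desired 4-DOF wrench [f_z, tau_x, tau_y, tau_z] to rotor thrusts via the pseudo-inverse, then check them against the thrust limit.

```python
import numpy as np

# Hypothetical allocation matrix for a "+"-configured quad:
# rows map the four rotor thrusts to [f_z, tau_x, tau_y, tau_z].
l, c = 0.3, 0.02  # arm length [m], drag-to-thrust coefficient (illustrative)
A = np.array([
    [ 1.0,  1.0,  1.0,  1.0],
    [ 0.0, -l,    0.0,  l  ],
    [ l,    0.0, -l,    0.0],
    [-c,    c,   -c,    c  ],
])

wrench_des = np.array([20.0, 0.5, -0.5, 0.0])  # desired [N, Nm, Nm, Nm]
thrusts = np.linalg.pinv(A) @ wrench_des       # pseudo-inverse "diagonalization"

THRUST_MAX = 8.0  # illustrative per-rotor limit [N]
print(thrusts, np.all(thrusts <= THRUST_MAX))
```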

Attitude control in a frame other than the principal inertia axes

The conventional method first finds the principal inertia axes and then solves the linear state equation using their reciprocals,
but since the state equation itself is a coupled system, the principal axes have no special meaning. Just invert the inertia matrix.

Furthermore, aligning the axes of the CoG frame with those of the FCU makes the control easier.

Related to #63 and #40.

Unify control to the CoG frame

Currently, assuming quasi-static transformation, we set
FCU-frame yaw, position = CoG-frame yaw, position,
but this is not strictly correct.

In particular, when transforming from o to l in simulation or on the real machine, yaw deviates significantly at the end.

This is probably the problem here, and the root causes may be that
1. there is no need to align the CoG frame with the principal inertia axes in the first place, and
2. the nonlinear gyroscopic term is not compensated.

[sensor fusion] Delay of the sensor values used in the update step

Problem:
Optical flow and GPS take time to process,
so their update step arrives later than the IMU-based predict step. This may be degrading the fusion.

Solutions:

  1. Go back to the old queue-based approach.
  2. Master ethzasl_sensor_fusion.

Besides timestamp synchronization, ethzasl_sensor_fusion also uses a more accurate model (#49).
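Solution 1 (the queue approach) can be sketched as buffering recent predict-step states by timestamp, so a delayed measurement corrects the state that was valid at its own stamp (hypothetical structure, not the actual estimator code):

```python
from collections import deque


class DelayedUpdateBuffer:
    """Keep a short history of (stamp, state) from the IMU predict step,
    so a delayed measurement (optical flow / GPS) can be matched to the
    state that was valid at the measurement's own timestamp."""

    def __init__(self, maxlen=200):
        self.history = deque(maxlen=maxlen)

    def push_predict(self, stamp, state):
        self.history.append((stamp, state))

    def find_state_at(self, meas_stamp):
        # pick the buffered state closest in time to the measurement
        return min(self.history, key=lambda item: abs(item[0] - meas_stamp))


buf = DelayedUpdateBuffer()
for i in range(10):                       # predict at 100 Hz for 0.1 s
    buf.push_predict(i * 0.01, {"x": i * 0.1})

stamp, state = buf.find_state_at(0.033)   # measurement stamped ~60 ms in the past
print(stamp, state)
```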

The transform controller crashes with an 8-link hydrus

@tongtybj
For now I built an 8-link Hydrus (open-link first; the closed-link version will be considered later) and ran
roslaunch hydrus_transform_control hydrusx_bringup.launch simulation:=True headless:=False model:=octa_closed
and the transform controller crashes.
With printf debugging, I found that it apparently dies inside hamiltonMatrixSolver.

How to set root link

@KahnShi

root_link is the link to which the main flight controller and IMU board are attached. Therefore, the root link is usually the middle link on the real machine. The root link plays an important role in both attitude estimation and flight control.

I thought there was a bug in my code, since the UAV cannot fly if we set a root link different from the default setting.

In fact, there is no bug. The point is that the root link is counted from 1, not 0, following the description rule of URDF.

Two parameters should be modified when you want to change the root link:
Hydrus.yaml: the config file for the transform control. Note that this parameter is also important for the real machine. It can be considered the control part.
SimulationControl.yaml: the config file for the gazebo parameters. This parameter is responsible for the location of the virtual IMU board in the gazebo environment.

The two parameters should be identical; e.g., if we set the first link as the root link,
please set both parameters to 1.
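For example, to make the first link the root link, both files would carry the same value. The key name shown below is illustrative; check the actual files for the exact parameter name:

```yaml
# Hydrus.yaml (transform control side) -- key name is illustrative
root_link: 1
---
# SimulationControl.yaml (gazebo side, virtual IMU board location)
root_link: 1
```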

[servo] Turn off when crashing into the ground

The impact on the servo when crashing into the ground is very large and may break the motor inside.
So we have to turn off the servo and shift to passive mode as quickly as possible in this emergency case.

Watch the sensor feedback from the IMU and the altitude velocity, as well as the force-landing flag.
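A hypothetical sketch of the detection logic (thresholds and names are invented for illustration): watch the vertical acceleration and descent speed from the estimator, plus the force-landing flag, and release the servos when an impact is likely:

```python
ACC_IMPACT_THRESH = 30.0  # m/s^2, illustrative impact threshold from the IMU
VEL_FALL_THRESH = -2.0    # m/s, illustrative descent-speed threshold


def should_release_servo(acc_z, vel_z, force_landing):
    """Return True when the servos should be switched off (passive mode)."""
    impact = abs(acc_z) > ACC_IMPACT_THRESH
    falling_fast = vel_z < VEL_FALL_THRESH
    return force_landing or (impact and falling_fast)


print(should_release_servo(acc_z=35.0, vel_z=-3.0, force_landing=False))  # True
print(should_release_servo(acc_z=9.8, vel_z=-0.2, force_landing=False))   # False
```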

Mavlink with kinetic version

aerial_robot_comm is based on the indigo version of mavlink, so right now we pin the mavlink version by adding the repository link in .rosinstall (#20). In the future, we have to update the code in aerial_robot_comm to follow the kinetic version of mavlink.

[Sensor Fusion] baselink based sensor fusion and COG estimation

  • We have to fuse the different sensors strictly in the identical frame: baselink
  1. IMU: the FCU board should be exactly at the origin of the baselink
  2. Mocap: the centroid should be exactly at the origin of the baselink
  3. Altitude range sensor / VO: set as close as possible to the IMU-FCU board, then take the offset into account in the baselink sensor fusion
  4. Optical flow / GPS: velocity estimation; do not compensate yet
  • CoG estimation: estimate the CoG point using the IMU data (attitude & angular velocity)
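The CoG estimation from baselink quantities follows rigid-body kinematics: v_cog = v_baselink + omega x r, where r is the baselink-to-CoG offset given by the (transformable) kinematics. A numpy sketch with illustrative values:

```python
import numpy as np


def cog_velocity(v_baselink, omega, r_b2cog):
    """Rigid-body transform of linear velocity from baselink to CoG.
    omega is the body angular velocity from the IMU (note: noisy, needs
    filtering); r_b2cog is the baselink -> CoG offset from the kinematics."""
    return v_baselink + np.cross(omega, r_b2cog)


v_b = np.array([1.0, 0.0, 0.0])    # m/s, velocity estimated at baselink
omega = np.array([0.0, 0.0, 0.5])  # rad/s, IMU gyro
r = np.array([0.2, 0.0, 0.0])      # m, baselink -> CoG offset
print(cog_velocity(v_b, omega, r))  # [1.  0.1 0. ]
```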

Gazebo cheat mode

@tongtybj

  1. Adjust torque_force rate
  2. Increase joint velocity
  3. Add yaw velocity control in /uav/uav topic (currently missing)

Please change hydrus::ServoState's load to current (int16_t)

@tongtybj
With the move to dynamixel 2.0, the load field, which used to be 8 bit, has become a signed 16-bit current, so please change it.
This field seems to be used inside aerial_robot/robots/hydrus/src/dynamicxel_bridge.cpp, so please handle that part as well.
Whether to name it current or load is up to you.
Also, since Halo's update to dynamixel 2.0 may not be ready in time, please put everything in a single commit so that we only need to revert one commit.

Remaining hydrus tasks

CoG-based state estimation and flight control are complete, and autonomous flight with range sensor + optical flow has been achieved.
The remaining Hydrus tasks are:

1. Real-machine test of waypoint navigation; verification with whole-body aerial manipulation is mandatory.
2. Update of the onboard processor and sensors: intel euclid (side view) + tx1 + downward pointgrey camera.
3. Real-machine test of EKF attitude estimation; rosbag is also acceptable.
4. Reduce the noise of the IMU gyro (angular velocity) used when computing the baselink -> CoG velocity component (strengthen the LPF); this also matters for dragon.

[aerial_robot_estimation] OpenCV 3 has a different API

@chibi314

The optical flow part was implemented with OpenCV 2.4.x, so the following errors appear with OpenCV 3.
We should probably check the OpenCV version.

$ catkin bt
==> Expanding alias 'bt' from 'catkin bt' to 'catkin b --this'
==> Expanding alias 'b' from 'catkin b --this' to 'catkin build --this'
--------------------------------------------------------------------------------------------------------------------------------------------------
Profile:                     default
Extending:          [cached] /home/chou/ros/kinetic/devel:/home/chou/ros/uav_forest_ws/devel:/home/chou/ros/aerial_robot_ws/devel:/opt/ros/kinetic
Workspace:                   /home/chou/ros/aerial_robot_ws
--------------------------------------------------------------------------------------------------------------------------------------------------
Source Space:       [exists] /home/chou/ros/aerial_robot_ws/src
Log Space:          [exists] /home/chou/ros/aerial_robot_ws/logs
Build Space:        [exists] /home/chou/ros/aerial_robot_ws/build
Devel Space:        [exists] /home/chou/ros/aerial_robot_ws/devel
Install Space:      [unused] /home/chou/ros/aerial_robot_ws/install
DESTDIR:            [unused] None
--------------------------------------------------------------------------------------------------------------------------------------------------
Devel Space Layout:          linked
Install Space Layout:        None
--------------------------------------------------------------------------------------------------------------------------------------------------
Additional CMake Args:       None
Additional Make Args:        None
Additional catkin Make Args: None
Internal Make Job Server:    True
Cache Job Environments:      False
--------------------------------------------------------------------------------------------------------------------------------------------------
Whitelisted Packages:        None
Blacklisted Packages:        None
--------------------------------------------------------------------------------------------------------------------------------------------------


--------------------------------------------------------------------------------------------------------------------------------------------------
WARNING: Your current environment's CMAKE_PREFIX_PATH is different from the cached CMAKE_PREFIX_PATH used the last time this workspace was built.

If you want to use a different CMAKE_PREFIX_PATH you should call `catkin clean` to remove all references to the previous CMAKE_PREFIX_PATH.

Cached CMAKE_PREFIX_PATH:
	/home/chou/ros/kinetic/devel:/home/chou/ros/uav_forest_ws/devel:/home/chou/ros/aerial_robot_ws/devel:/opt/ros/kinetic
Current CMAKE_PREFIX_PATH:
	/home/chou/ros/test_ws/devel:/home/chou/ros/kinetic/devel:/home/chou/ros/uav_forest_ws/devel:/home/chou/ros/aerial_robot_ws/devel:/opt/ros/kinetic
--------------------------------------------------------------------------------------------------------------------------------------------------

[build] Found '37' packages in 0.0 seconds.                                                                                                                                                                
[build] Updating package table.                                                                                                                                                                            
Starting  >>> aerial_robot_estimation                                                                                                                                                                      
___________________________________________________________________________________________________________________________________________________________________________________________________________
Warnings   << aerial_robot_estimation:cmake /home/chou/ros/aerial_robot_ws/logs/aerial_robot_estimation/build.cmake.000.log                                                                                
CMake Warning at /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/CMakeLists.txt:16 (message):
  OPENCV 3.2.0 FOUND


cd /home/chou/ros/aerial_robot_ws/build/aerial_robot_estimation; catkin build --get-env aerial_robot_estimation | catkin env -si  /usr/bin/cmake /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation --no-warn-unused-cli -DCATKIN_DEVEL_PREFIX=/home/chou/ros/aerial_robot_ws/devel/.private/aerial_robot_estimation -DCMAKE_INSTALL_PREFIX=/home/chou/ros/aerial_robot_ws/install; cd -
...........................................................................................................................................................................................................
___________________________________________________________________________________________________________________________________________________________________________________________________________
Errors     << aerial_robot_estimation:make /home/chou/ros/aerial_robot_ws/logs/aerial_robot_estimation/build.make.000.log                                                                                  
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
                 from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:432:28: error: ‘vector’ does not name a type
 CV_EXPORTS void merge(const vector<GpuMat>& src, GpuMat& dst, Stream& stream = Stream::Null());
                            ^
/usr/include/opencv2/gpu/gpu.hpp:432:34: error: expected ‘,’ or ‘...’ before ‘<’ token
 CV_EXPORTS void merge(const vector<GpuMat>& src, GpuMat& dst, Stream& stream = Stream::Null());
                                  ^
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
                 from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:438:41: error: ‘vector’ has not been declared
 CV_EXPORTS void split(const GpuMat& src, vector<GpuMat>& dst, Stream& stream = Stream::Null());
                                         ^
/usr/include/opencv2/gpu/gpu.hpp:438:47: error: expected ‘,’ or ‘...’ before ‘<’ token
 CV_EXPORTS void split(const GpuMat& src, vector<GpuMat>& dst, Stream& stream = Stream::Null());
                                               ^
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
                 from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:1265:4: error: ‘vector’ does not name a type
    vector<Point> locations;
    ^
/usr/include/opencv2/gpu/gpu.hpp:1266:4: error: ‘vector’ does not name a type
    vector<double> confidences;
    ^
/usr/include/opencv2/gpu/gpu.hpp:1267:4: error: ‘vector’ does not name a type
    vector<double> part_scores[4];
    ^
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
                 from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:1285:31: error: ‘vector’ does not name a type
     void setSVMDetector(const vector<float>& detector);
                               ^
/usr/include/opencv2/gpu/gpu.hpp:1285:37: error: expected ‘,’ or ‘...’ before ‘<’ token
     void setSVMDetector(const vector<float>& detector);
                                     ^
/usr/include/opencv2/gpu/gpu.hpp:1287:12: error: ‘vector’ does not name a type
     static vector<float> getDefaultPeopleDetector();
            ^
/usr/include/opencv2/gpu/gpu.hpp:1288:12: error: ‘vector’ does not name a type
     static vector<float> getPeopleDetector48x96();
            ^
/usr/include/opencv2/gpu/gpu.hpp:1289:12: error: ‘vector’ does not name a type
     static vector<float> getPeopleDetector64x128();
            ^
/usr/include/opencv2/gpu/gpu.hpp:1291:36: error: ‘vector’ has not been declared
     void detect(const GpuMat& img, vector<Point>& found_locations,
                                    ^
/usr/include/opencv2/gpu/gpu.hpp:1291:42: error: expected ‘,’ or ‘...’ before ‘<’ token
     void detect(const GpuMat& img, vector<Point>& found_locations,
                                          ^
/usr/include/opencv2/gpu/gpu.hpp:1295:46: error: ‘vector’ has not been declared
     void detectMultiScale(const GpuMat& img, vector<Rect>& found_locations,
                                              ^
/usr/include/opencv2/gpu/gpu.hpp:1295:52: error: expected ‘,’ or ‘...’ before ‘<’ token
     void detectMultiScale(const GpuMat& img, vector<Rect>& found_locations,
                                                    ^
/usr/include/opencv2/gpu/gpu.hpp:1300:47: error: ‘vector’ has not been declared
     void computeConfidence(const GpuMat& img, vector<Point>& hits, double hit_threshold,
                                               ^
/usr/include/opencv2/gpu/gpu.hpp:1300:53: error: expected ‘,’ or ‘...’ before ‘<’ token
     void computeConfidence(const GpuMat& img, vector<Point>& hits, double hit_threshold,
                                                     ^
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
                 from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:1303:57: error: ‘vector’ has not been declared
     void computeConfidenceMultiScale(const GpuMat& img, vector<Rect>& found_locations,
                                                         ^
/usr/include/opencv2/gpu/gpu.hpp:1303:63: error: expected ‘,’ or ‘...’ before ‘<’ token
     void computeConfidenceMultiScale(const GpuMat& img, vector<Rect>& found_locations,
                                                               ^
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
                 from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:1835:5: error: ‘vector’ does not name a type
     vector<GpuMat> prevPyr_;
     ^
/usr/include/opencv2/gpu/gpu.hpp:1836:5: error: ‘vector’ does not name a type
     vector<GpuMat> nextPyr_;
     ^
/usr/include/opencv2/gpu/gpu.hpp:1838:5: error: ‘vector’ does not name a type
     vector<GpuMat> buf_;
     ^
/usr/include/opencv2/gpu/gpu.hpp:1839:5: error: ‘vector’ does not name a type
     vector<GpuMat> unused;
     ^
make[2]: *** [CMakeFiles/optical_flow.dir/src/optical_flow.cpp.o] Error 1
make[1]: *** [CMakeFiles/optical_flow.dir/all] Error 2
make: *** [all] Error 2
cd /home/chou/ros/aerial_robot_ws/build/aerial_robot_estimation; catkin build --get-env aerial_robot_estimation | catkin env -si  /usr/bin/make --jobserver-fds=6,7 -j; cd -
...........................................................................................................................................................................................................
Failed     << aerial_robot_estimation:make           [ Exited with code 2 ]                                                                                                                                
Failed    <<< aerial_robot_estimation                [ 3.9 seconds ]                                                                                                                                       
[build] Summary: 0 of 1 packages succeeded.                                                                                                                                                                
[build]   Ignored:   36 packages were skipped or are blacklisted.                                                                                                                                          
[build]   Warnings:  1 packages succeeded with warnings.                                                                                                                                                   
[build]   Abandoned: None.                                                                                                                                                                                 
[build]   Failed:    1 packages failed.                                                                                                                                                                    
[build] Runtime: 4.1 seconds total.                                                                                                                                                                        
[build] Note: Workspace packages have changed, please re-source setup files to use them.
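Since the opencv2/gpu headers only exist in the OpenCV 2.4.x series (OpenCV 3 moved the module to cuda and changed the API), one way to handle this (a sketch, not the actual fix in the repository) is to branch on the detected version in CMake:

```cmake
# Illustrative CMakeLists.txt fragment for aerial_robot_estimation.
find_package(OpenCV REQUIRED)
message(STATUS "OpenCV ${OpenCV_VERSION} found")

if(OpenCV_VERSION VERSION_LESS "3.0.0")
  # 2.4.x: the legacy opencv2/gpu API used by optical_flow.cpp is available
  add_definitions(-DUSE_OPENCV2_GPU_API)
endif()
```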

Gazebo

@KahnShi @chibi314

The basic knowledge for gazebo ros interface:

  1. ros_control in gazebo
    Basic tutorial and introduction for gazebo control.

  2. gazebo_ros_control_plugin
    The hardware interface in the wide sense. It parses the transmission elements, creates instances of the "hw interface sim" and "controller_manager_", and executes "readSim", "writeSim", and "controller_manager_->update".

  3. default_robot_hw_sim
    The hardware interface in the narrow sense. It creates the instance for a specific hardware interface (i.e., joint, motor twist in a UAV) and defines the functions "readSim" and "writeSim".

  4. hector_quadcopter
    The code for a rigid UAV in gazebo. The core code should be here.

  • Transmission:
    Connects a specific hardware_interface (joint, motor twist in a UAV) with an actuator_interface; check the example.

  • The connection between controller and HW:
    The controller is called from the controller_spawner, while the HW is called from gazebo_ros_control_plugin. The connection between controller and HW is based on a pointer(?). Please check the hector code.
    Also check this code.
