

direct_lidar_odometry's Issues

Initialization function

Hi,
Thanks for your great work! I have been testing DLO for a long time, and its performance is excellent.
In some tests, for environmental reasons such as a particularly sparse scene, DLO's results are incorrect. Of course, this is not a problem with DLO itself. When this happens, I don't want to shut DLO down and restart it; I would rather re-initialize it, just like the initialpose function in ROS.
So could you please add an "initialpose"-like feature to DLO, so that when an accident causes an inaccurate pose, DLO can keep running without being shut down?
I apologize if my request is presumptuous.
Thank you again for your great work.

Best wishes

about covariance

Thank you for your excellent work! It seems that you don't recalculate the covariance of the source cloud in every loop, or maybe I missed it somewhere?

Parameter Tuning Suggestions

Hi Kenny,

Thanks for your great work!
I am now using DLO as the front end of my SLAM system and have observed some performance drops on open-space datasets (e.g., KITTI 00). I suspect it is a parameter-tuning problem, since I assume the current parameter set was mainly designed for the DARPA Subterranean Challenge. So, could you give me some hints on how to tune the parameters (especially the submapping-related ones) for better adaptation to such environments?

Drift using 2D Laser in large spaces

Hi,

We have been testing DLO for a few weeks to compute odometry for a mobile robot, with great results. However, when we moved to a larger space, we observed a lot of drift. We did some parameter tuning as suggested in previous issue tickets concerning sparse lidars, but could not reach acceptable results. The laser measurements are 2D and cover only about 270° rather than the full 360°; in addition, the laser sometimes measures the ground. Do you think the drift is related to this?

image

We played with the outlier rejection parameters, the max correspondence distance for both S2S and S2M, and the adaptive parameters. In this environment an error related to the convex hull is raised every time: [pcl::ConvexHull::performReconstrution3D] ERROR: qhull was unable to compute a convex hull for the given point cloud (375)!. This error does not occur in smaller places.

Thank you very much for your help !

getSubmapKeyframes()

Hello authors,
Thanks for sharing this excellent work for us to study. :)
In getSubmapKeyframes() we call computeConvexHull() to compute the indices of the keyframes whose positions (xyz) form the convex hull, and we push those indices into keyframe_convex. Suppose keyframe_convex = [3, 6, 8, 10, 13]. We then push the distance from the current scan position to each keyframe_convex position, in order, into convex_ds; suppose convex_ds = [1.0, 0.2, 1.4, 0.3, 3.0]. Then pushSubmapIndices() selects the submap_kcv_ smallest elements, representing the submap_kcv_ convex keyframes nearest to the current scan position. Suppose submap_kcv_ = 2. In pushSubmapIndices() the distances 0.2 and 0.3 will be selected, so we push index 1 and index 3 into submap_kf_idx_curr. But I think we should instead push the corresponding entries of keyframe_convex, namely 6 and 10, into submap_kf_idx_curr, since those are the keyframe indices that the distances 0.2 and 0.3 correspond to. Right?
If I have misread the code somewhere, please correct me. If I'm right, the same problem also exists in the pushSubmapIndices() call for computeConcaveHull().
By the way, this->pose at L1252 is the last scan's position in the global frame. Why not use the current scan's guessed position in the global frame, namely this->T_s2s, which we get from this->propagateS2S(T_S2S)? Perhaps because the robot's motion between scans is small and the lidar runs at 10 Hz, this effect can be ignored.
Thanks for your time and help very much in advance!
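For concreteness, the index mapping described above can be sketched as a standalone function (a hypothetical sketch, not DLO's actual code): select the k smallest entries of convex_ds, then translate each selected position back into a keyframe index through keyframe_convex.

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <numeric>
#include <vector>

// Hypothetical sketch of the suggested fix: the k nearest hull keyframes
// should be reported as keyframe_convex[i], not as the positions i inside
// convex_ds.
std::vector<int> nearestHullKeyframes(const std::vector<float>& convex_ds,
                                      const std::vector<int>& keyframe_convex,
                                      std::size_t k) {
  assert(convex_ds.size() == keyframe_convex.size());
  // order positions in convex_ds by ascending distance
  std::vector<std::size_t> order(convex_ds.size());
  std::iota(order.begin(), order.end(), 0);
  const std::size_t n = std::min(k, order.size());
  std::partial_sort(order.begin(), order.begin() + n, order.end(),
                    [&](std::size_t a, std::size_t b) { return convex_ds[a] < convex_ds[b]; });
  std::vector<int> result;
  for (std::size_t i = 0; i < n; ++i)
    result.push_back(keyframe_convex[order[i]]);  // map back to the keyframe index
  return result;
}
```

With the example values above (keyframe_convex = [3, 6, 8, 10, 13], convex_ds = [1.0, 0.2, 1.4, 0.3, 3.0], k = 2) this returns {6, 10} rather than {1, 3}.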

Nano-GICP branch

Hi. Thank you for sharing this great work.

I made a Nano-GICP-only repository so it can be used as a module in other packages, here.
If you can create a new separate branch for it, I'd love to push this work via a pull request, if you like it.
(Otherwise, I cannot open a pull request to a new branch from my end.)

With the module branch, Nano-GICP can be easily imported and used as follows:
CMakeLists.txt
Header
Method
Or simply as explained here.

update key frames

Thanks for fully and quickly answering my last question!
Now, if the feature information has changed significantly after mapping, the accuracy and stability of localization will be poor.
To avoid this, updating exactly those keyframes whose feature information has changed would seem to be the better solution.
So, following your suggestion, how can we update exactly the keyframes whose feature information has changed?
thanks.

Crashes after DLO initialization

Hello,
Thanks for your work. I am trying to run your code on your test data. Although the code built without any errors, it crashes at run time after DLO is initialized. Please see the attached picture for reference.

Screenshot from 2022-07-17 13-19-30

Can you please help me fix this? It would be great to use your code, as the results and video look promising.

P.S. I am running this on the dataset you have provided.

Information about my system:
OS: Ubuntu 20.04
ROS: Noetic
PCL: 1.12.1 (Installed from source)
gcc: 9.4.0 (so I assume this takes care of the OpenMP requirement, although I also ran sudo apt install libomp-dev)

I do not understand where things are going wrong, and I wonder if the pcl-1.10 that comes with ROS is interfering with the pcl-1.12.1 I installed from source. On further investigation I found that the code crashes at:

this->crop.filter(*this->current_scan);

If I disable that line, it crashes at the voxel filter instead.

Parameter tuning suggestions

I am doing research related to motion-based calibration and am using DLO for lidar odometry generation. It has worked fine so far, but on my current dataset it gives unreasonable pose estimates. What are the first parameters to try tuning? The figure below shows some details of my dataset: the path is a figure-eight on planar ground.
googlemaps_fig

Sample point clouds look like the figure below. I have combined two VLP-16 point clouds:
sample_pointcloud

ROS 2 support

Hi, are you planning to migrate the code to ROS 2? Thanks.

Question about the querying efficiency.

Dear Professor, thank you for your wonderful work on lidar odometry. One of the main contributions of this paper is its focus on efficiency. From my point of view, the time cost in lidar odometry primarily involves two parts: building the k-d tree, and the searches that rely on the pre-built k-d tree. The proposed data-structure recycling efficiently addresses the former, while the latter can be addressed with OpenMP parallel acceleration. However, when I split the covariance calculation out into a standalone project, its performance (5-25 ms) was much slower than the same procedure inside direct_lidar_odometry (2-8 ms). I don't know what went wrong. Could you please help me check whether there is a mistake in my code, or whether there are any tricks I might have missed? The code is listed below. Thank you very much.

  auto cov_start = omp_get_wtime();
#pragma omp parallel for num_threads(16) schedule(guided, 8)
  for (int i = 0; i < (int)current_features->points.size(); i++) {
    // k-nearest neighbors of the i-th feature point
    std::vector<int> k_indices;
    std::vector<float> k_sq_distances;
    frame_kdtree_->nearestKSearch(current_features->at(i), k_correspondences_, k_indices, k_sq_distances);
    // gather the neighbors as homogeneous columns
    Eigen::Matrix<double, 4, -1> neighbors(4, k_correspondences_);
    for (int j = 0; j < (int)k_indices.size(); ++j) {
      neighbors.col(j) = current_features->at(k_indices[j]).getVector4fMap().template cast<double>();
    }
    // sample covariance, then regularize the singular values to (1, 1, 1e-3) as in GICP
    neighbors.colwise() -= neighbors.rowwise().mean().eval();
    Eigen::Matrix4d cov = neighbors * neighbors.transpose() / k_correspondences_;
    Eigen::JacobiSVD<Eigen::Matrix3d> svd(cov.block<3, 3>(0, 0), Eigen::ComputeFullU | Eigen::ComputeFullV);
    Eigen::Vector3d values = Eigen::Vector3d(1, 1, 1e-3);
    covariances[i].setZero();
    covariances[i].template block<3, 3>(0, 0) = svd.matrixU() * values.asDiagonal() * svd.matrixV().transpose();
  }
  auto cov_end = omp_get_wtime();
  double cov_duration = (cov_end - cov_start) * 1000;
  LOG(INFO) << "The time of covariance calculation is: " << cov_duration << " ms.";

jacobian

Hello author,
I want to derive the Jacobian in the code. I see we use a global perturbation for the update. When I try this, I run into the problem below and would like your help.
image
Using the Woodbury matrix identity seems to make the formula more complicated, so I'm stuck and looking for your help. Thanks in advance!

Best regards
Xiaoliang jiao

Coordinate Frame

Hi,

Thank you for sharing this great work.

I have a problem with defining the coordinate system of the odometry. Which coordinate frame do you use for odometry, NED or body? I assumed NED, so I expected that when I move the lidar + IMU in the north direction, x should increase, but the behavior was not consistent. I also moved the system in the east direction, which should cause a positive increase on the y axis, but I saw the same problem. So I think you use the body frame or some other coordinate frame for odometry.

I also wonder how you send the odometry information to a drone's autopilot. I used mavros, selected the related odometry ROS topic, and forwarded the messages unchanged. The odometry messages arrive at the autopilot, but x and y appear swapped there, even though they are not swapped in the odometry output of direct_lidar_odometry.

In summary: which coordinate frame do you use for odometry?
How can I set the odometry frame to NED?
x and y decrease when I move the lidar + IMU to the north and the east, respectively, and z looks reversed by 180°.
Should I multiply x and y by minus one, or apply a rotation matrix, to fix the decreasing x and y odometry?

How can I ensure that the autopilot and your odometry use the same coordinate frame?
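As a minimal sketch of the axis question above (generic frame math, not DLO code): ROS world frames follow REP-103 and are ENU (x east, y north, z up), so if an NED pose is needed downstream, a fixed axis permutation is usually cleaner than ad-hoc sign flips on x and y:

```cpp
#include <array>
#include <cassert>

// Convert an ENU position to NED: N = y, E = x, D = -z.
// (The orientation needs the corresponding fixed rotation as well.)
std::array<double, 3> enuToNed(const std::array<double, 3>& enu) {
  return {enu[1], enu[0], -enu[2]};  // {N, E, D} from {E, N, U}
}
```

Applying this permutation (and the matching rotation for the quaternion) in one place is usually safer than negating individual components.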

gicp speed

When I run the source code, each scan's GICP step costs only 2-3 ms, but when I run my own code using the GICP library it costs 100-200 ms. Both input clouds are almost the same size. Which settings could cause this difference?

Coordinate system conversion

Hi, thanks for your great work. I recorded a dataset with my lidar and IMU and ran it in DLO. The following figures show the map generated by the algorithm. I found that the IMU drift is very serious. Is this related to the coordinate-system transformation? How should I set up the configuration files to improve this?
d0724d53975aa58f749b1abfa692456
3ae5711ad4dc74f03c683af98bc7db9

When I run "catkin build", it fails with the following error log:

[ 88%] Linking CXX executable /home/swc/study/slam_study/catkin_ws/devel/lib/direct_lidar_odometry/dlo_odom_node
/usr/bin/ld: warning: libboost_thread.so.1.65.1, needed by /usr/lib/x86_64-linux-gnu/libpcl_common.so, may conflict with libboost_thread.so.1.57.0
/usr/bin/ld: warning: libboost_chrono.so.1.65.1, needed by /opt/ros/melodic/lib/libroscpp.so, may conflict with libboost_chrono.so.1.57.0
/usr/bin/ld: warning: libboost_filesystem.so.1.65.1, needed by /opt/ros/melodic/lib/libroscpp.so, may conflict with libboost_filesystem.so.1.57.0
/usr/bin/ld: warning: libboost_regex.so.1.65.1, needed by /opt/ros/melodic/lib/librosconsole.so, may conflict with libboost_regex.so.1.57.0
/usr/bin/ld: warning: libboost_system.so.1.57.0, needed by /home/swc/anaconda3/envs/badgr/lib/libboost_filesystem.so, may conflict with libboost_system.so.1.65.1
/usr/bin/ld: warning: libicuuc.so.54, needed by /home/swc/anaconda3/envs/badgr/lib/libboost_regex.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libicui18n.so.54, needed by /home/swc/anaconda3/envs/badgr/lib/libboost_regex.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libicudata.so.54, needed by /home/swc/anaconda3/envs/badgr/lib/libboost_regex.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libpcl_filters.so.1.9, needed by /home/swc/anaconda3/envs/badgr/lib/libpcl_segmentation.so, may conflict with libpcl_filters.so.1.8
/usr/bin/ld: warning: libpcl_search.so.1.9, needed by /home/swc/anaconda3/envs/badgr/lib/libpcl_segmentation.so, may conflict with libpcl_search.so.1.8
/usr/bin/ld: warning: libpcl_common.so.1.9, needed by /home/swc/anaconda3/envs/badgr/lib/libpcl_segmentation.so, may conflict with libpcl_common.so.1.8
/usr/bin/ld: warning: libboost_filesystem.so.1.72.0, needed by /home/swc/anaconda3/envs/badgr/lib/libpcl_recognition.so, may conflict with libboost_filesystem.so.1.57.0
/usr/bin/ld: warning: libboost_thread.so.1.72.0, needed by /home/swc/anaconda3/envs/badgr/lib/libpcl_io.so, may conflict with libboost_thread.so.1.57.0
/usr/bin/ld: warning: libboost_iostreams.so.1.72.0, needed by /home/swc/anaconda3/envs/badgr/lib/libpcl_io.so, may conflict with libboost_iostreams.so.1.57.0
/home/swc/anaconda3/envs/badgr/lib/libboost_regex.so: undefined reference to `u_tolower_54'
/home/swc/anaconda3/envs/badgr/lib/libboost_regex.so: undefined reference to `icu_54::Locale::Locale(icu_54::Locale const&)'
/home/swc/anaconda3/envs/badgr/lib/libboost_regex.so: undefined reference to `icu_54::Locale::Locale()'
/home/swc/anaconda3/envs/badgr/lib/libboost_regex.so: undefined reference to `icu_54::Locale::~Locale()'
/home/swc/anaconda3/envs/badgr/lib/libboost_regex.so: undefined reference to `u_charFromName_54'
/home/swc/anaconda3/envs/badgr/lib/libboost_regex.so: undefined reference to `u_isblank_54'
/home/swc/anaconda3/envs/badgr/lib/libboost_regex.so: undefined reference to `icu_54::Collator::createInstance(icu_54::Locale const&, UErrorCode&)'
/home/swc/anaconda3/envs/badgr/lib/libboost_regex.so: undefined reference to `u_charType_54'
/home/swc/anaconda3/envs/badgr/lib/libboost_regex.so: undefined reference to `u_digit_54'
/home/swc/anaconda3/envs/badgr/lib/libboost_regex.so: undefined reference to `u_isspace_54'
collect2: error: ld returned 1 exit status
direct_lidar_odometry/CMakeFiles/dlo_odom_node.dir/build.make:364: recipe for target '/home/swc/study/slam_study/catkin_ws/devel/lib/direct_lidar_odometry/dlo_odom_node' failed
make[2]: *** [/home/swc/study/slam_study/catkin_ws/devel/lib/direct_lidar_odometry/dlo_odom_node] Error 1
CMakeFiles/Makefile2:666: recipe for target 'direct_lidar_odometry/CMakeFiles/dlo_odom_node.dir/all' failed
make[1]: *** [direct_lidar_odometry/CMakeFiles/dlo_odom_node.dir/all] Error 2
Makefile:145: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j8 -l8" failed

Covariances Calculation in updatekeyframe

Great job!

I found that the covariances are calculated in the function updateKeyframes().
I am not sure, but if
image
were executed after updateKeyframes(), could the time spent computing the covariances inside updateKeyframes() be saved?

Bug in adaptive threshold.

Thank you for your wonderful work, but there seems to be a small bug in the adaptive threshold estimation:
From the paper: "Thus, we choose to scale the translational threshold for new keyframes according to the "spaciousness" in the instantaneous point cloud scan, defined as $m_k = \alpha m_{k-1} + \beta M_k$, where $M_k$ is the median Euclidean point distance from the origin to each point in the preprocessed point cloud, $\alpha = 0.95$, $\beta = 0.05$." However, the code in odom.cc that computes the "spaciousness" is:

  std::nth_element(ds.begin(), ds.begin() + ds.size()/2, ds.end());
  float median_curr = ds[ds.size()/2];
  static float median_prev = median_curr;
  float median_lpf = 0.95*median_prev + 0.05*median_curr;
  median_prev = median_lpf;

It seems it always uses the median distance as "spaciousness".

how to set a different initialpose?

Hi,
Thanks for your team's great work. I have tested it with my own bag, and the result is very close to my ground truth.
I have a request: I want to set the initial pose, e.g. (100, 100, 0, 0, 0, 1, 0.026). But it didn't work; "robot/dlo/odom" still always starts at (0, 0, 0, 0, 0, 0, 1). I only changed the following code in odom.cc:
this->odom.pose.pose.position.x = 3972.02;
this->odom.pose.pose.position.y = 3497.76;
this->odom.pose.pose.position.z = 0;
this->odom.pose.pose.orientation.w = 0.026;
this->odom.pose.pose.orientation.x = 0.;
this->odom.pose.pose.orientation.y = 0.;
this->odom.pose.pose.orientation.z = 1;
this->odom.pose.covariance = {0.};

/*
this->odom.pose.pose.position.x = 0.;
this->odom.pose.pose.position.y = 0.;
this->odom.pose.pose.position.z = 0.;
this->odom.pose.pose.orientation.w = 1.;
this->odom.pose.pose.orientation.x = 0.;
this->odom.pose.pose.orientation.y = 0.;
this->odom.pose.pose.orientation.z = 0.;
this->odom.pose.covariance = {0.};
*/
this->T = Eigen::Matrix4f::Identity();
this->T_s2s = Eigen::Matrix4f::Identity();
this->T_s2s_prev = Eigen::Matrix4f::Identity();

this->pose_s2s = Eigen::Vector3f(0., 0., 0.);
this->rotq_s2s = Eigen::Quaternionf(1., 0., 0., 0.);

//this->pose = Eigen::Vector3f(0., 0., 0.);
this->pose = Eigen::Vector3f(3972.02, 3497.76, 0.);
//this->rotq = Eigen::Quaternionf(1., 0., 0., 0.);
this->rotq = Eigen::Quaternionf(0.026, 0., 0., 0.);
I would really appreciate it if someone could help me.
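A hypothetical sketch of one plausible reason the override above has no effect (the names below are illustrative, not DLO's actual members): if the published odometry message is rebuilt from the internal state on every cycle, values written only into the message fields are simply overwritten, so the internal pose/transform must be seeded consistently as well. Note that in the snippet above, this->T is still left as identity.

```cpp
#include <cassert>

// Illustrative only: a message that is refreshed from internal state on
// every publish cannot be seeded by editing the message fields alone.
struct OdomState {
  double px = 0, py = 0, pz = 0;           // internal pose (world frame)
  double msg_x = 0, msg_y = 0, msg_z = 0;  // published message fields
  void publish() { msg_x = px; msg_y = py; msg_z = pz; }  // message follows state
};
```

Under this assumption, a one-time offset would have to be written into every internal representation of the pose (transform, position vector, and quaternion) together, not just the message.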

Map generation

Hi,
How is the map generated? What algorithm is used? Can we upload a PCD map of the environment and then use DLO just for localization?
I assumed DLO was a localization algorithm, but it seems to take the point cloud data from the bag file and generate a map on its own.

Remove "namespace" attribute of the published topics and .pcd file connot be open.

Hi, first, thanks for sharing this fantastic work. I would like to use the launch file's option to rename the topics directly to the names other packages expect, and it works nicely as it is. The only problem is that the odometry node's namespace addition changes the topic names to something the target package will not recognize, which limits the usability of the topic-renaming option. Is there any way to remove the namespace?
It is a minor issue in most cases, or even not an issue at all, but it can be a problem when using e.g. mavros.

Another issue is saving the point clouds. The service works from the rqt service caller (changing the folder name in the name field), but afterwards I cannot visualize the point cloud in CloudCompare or Metashape, which suggests the .pcd file is not being written correctly. Maybe it is just my setup; if it works for you, it will surely work for anyone. I am using a point cloud conversion, so maybe that is the cause.

As improvements, since you use this on real robots, I would ask for a ground-truth topic and for publishing path topics for both the odometry and the ground truth.

The performance is fantastic on a real rover with a 4-core 2 GHz Celeron, just brilliant. Congratulations.
God bless you all for sharing knowledge.

Pose relative to initial pose of the lidar sensor.

Thank you for this amazing work and for making it open source! I am new to ROS and have a question: how do I get the pose at each frame relative to the initial pose of the lidar sensor (initial pose = 4x4 identity)? What is the /robot/dlo/odom_node/pose topic relative to? Also, when I test with my own bag containing 363 point cloud frames, I only get 359 poses from rostopic echo /robot/dlo/odom_node/pose. Is this intentional?

Hardware architecture

Hi. I have an NVIDIA NX board, which is based on the ARM architecture. I encountered the following problems when compiling on it.
image

Can this package run on the ARM architecture? Is this error related to the architecture? How can I solve it?

large drift in z-axis

Thanks a lot for open-sourcing such great work! But I found during my experiments that the odometry drifts a lot in the z-axis estimate. I walked about 600 m with the lidar in hand, and when I returned to the starting point I could see that the created point cloud map gradually became warped. Is there a good way to solve this, such as parameter adjustment or a change of motion mode? The lidar I use is a Livox MID360. Looking forward to hearing from you.
Screenshot from 2023-06-16 16-25-34
Screenshot from 2023-06-16 16-15-48

Computational speed and efficieny of DLO

Hello!
Thanks for the amazing work! This is really useful. I am currently trying to run your program on a single core (no multithreading with OpenMP as done in the NanoGICP code). However, the CPU usage and RAM allocation are really high. I wonder if this is because keyframes are continuously stored and never discarded from this->keyframes. I understand that storing all keyframes is helpful for retrieving far-back frames for the convex/concave hull, and for nearest neighbors in case of a loop maneuver, for example. But for computational reasons I was wondering whether, in your tests, the inclusion of the convex and concave hulls and/or the nearest-neighbor keyframes gave a major improvement in the output (map and trajectory) compared with taking just the latest k frames (which would allow me to discard old keyframes and keep only the most recent ones). Do you have any other recommendations or suggestions for running the algorithm on one core? P.S. I have tried different downsampling settings for the source cloud and the submap, but without much luck. Thanks in advance!

Loop closure

Hey, this is more of a question than an issue.
I wanted to ask whether your system does any kind of loop closure to ensure that, for big maps where you revisit a certain location after some time, the drift of the map is minimal.

Since in the videos and GIFs we never see a "drifted" map getting visually corrected, and all maps look very precise, I was wondering whether the algorithm's drift is really that small or whether some loop closure is happening.

Greetings and thanks for the great work
Sven

how to localize on a given pcd map?

I have trouble understanding how to localize on a given 3D PCD map; can anyone explain the steps?

I have a PCD map that I can load into RViz, and I have lidar points and IMU data. How can I use this package for localization?

How to improve the publish frequency of odom?

Hello, great author. The point cloud publish frequency of my lidar is 10 Hz, and the frequency of the odometry output from DLO is also 10 Hz. Is there any way to increase the publish frequency of the odometry? I have some work that needs high-frequency pose estimation. Thanks!

Relative gyro propagation quaternion dynamics

Hi, Thank you for your excellent work!
My question:
The calculation of the quaternion from the gyro measurements in the code (void dlo::OdomNode::integrateIMU()) seems inconsistent with the one in the paper. Please help me resolve this. Thank you.

the transformation between your lidar and imu?

I use a VLP-16 lidar and a MicroStrain IMU (3DM-GX5-25) to run your program, but it failed. I think my transformation between the IMU and the lidar is different from yours, so could you share your actual IMU-to-lidar transformation? The following picture shows my configuration:
453670779
Thank you very much!

a little problem

Hi Author,
I just wanted to let you know that I truly appreciate your work. It has been useful, and it's also very simple and easy to extend. Thank you for making it open source.
However, I did notice a small bug in the function pushSubmapIndices() when running in Debug mode: if dists is empty, pq.top() is invalid. In Release mode the compiler happens to mask the issue. Perhaps some code could be added to avoid this problem altogether?
Best regards.
Afei
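A minimal standalone sketch of the guard suggested above (assumed structure; not the actual pushSubmapIndices() implementation), keeping the k smallest distances in a max-heap:

```cpp
#include <cassert>
#include <queue>
#include <vector>

// Hedged sketch: calling pq.top() on an empty priority_queue is undefined
// behavior, which typically only surfaces in Debug builds. An early return
// on empty input avoids it entirely.
void pushSubmapIndicesSketch(const std::vector<float>& dists, int k,
                             std::vector<int>& out_indices) {
  if (dists.empty() || k <= 0) return;  // guard: nothing to select
  std::priority_queue<float> pq;        // max-heap holding the k smallest dists
  for (float d : dists) {
    if ((int)pq.size() < k) pq.push(d);
    else if (d < pq.top()) { pq.pop(); pq.push(d); }
  }
  const float kth = pq.top();  // safe: pq is non-empty here
  for (int i = 0; i < (int)dists.size(); ++i)
    if (dists[i] <= kth) out_indices.push_back(i);
}
```

With the guard in place, empty input is a no-op in both Debug and Release builds instead of undefined behavior.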

Usage of LIDAR intensities

Hi,

First, congrats on this great work! It is really amazing, even for 2D planar lidars.

I have a question: do you use the intensity information from the lidars in your GICP implementation?

Thank you very much !

About IMU Preintegration!

Hi authors, I read your code, and in the pre-integration part the IMU is used to estimate the rotation only, not the translation. Is that because your application scenario is a low-speed environment that does not require translation prediction?

Conversion of lidar and IMU coordinate system

Hi! I use a VLP-16 lidar and a MicroStrain IMU (3DM-GX5-25) to record bags, and here is part of the map I built with your algorithm. Because the coordinate systems of the IMU and the lidar are not physically aligned, the reconstruction of the two-story environment was affected. Could you please tell me how to align the IMU and lidar coordinate systems?
image

some trivial questions

Hello authors,
I have some trivial questions and would appreciate your help. Thanks.

How to correct point cloud caused by motion?

Hi, thank you for your great work on DLO. I have a question: when the robot moves fast, the lidar point cloud is distorted by the motion, which negatively impacts the map construction. How does DLO deal with this problem?
