jsk-ros-pkg / jsk_aerial_robot
The platform for aerial robots (e.g. general multirotor, hydrus, di, dragon, etc.)
Home Page: http://www.jsk.t.u-tokyo.ac.jp/index-j.html
Please use Eigen::MatrixXd getU().
Right now, the reference point of the UAV odometry is ambiguous:
#55 unified the odometry reference point to root_link.
We also have to add a publisher for the odometry of the UAV CoG.
@tongtybj
Currently, link2's global position, velocity, and orientation are published on /uav/state,
and the joint angles and angular velocities on /hydrus3/joint_states.
It would be much more convenient if we could also publish every link's position, velocity, and orientation.
The tree database starts as soon as the first detection occurs.
Change this so it starts when the UAV is close to the first tree.
On 2017/5/12, the GPS based flight control was tested.
Although the flight was stable, there was a bug: the flight ran in pos_world_based_control_mode,
so the position control used the low-accuracy position value estimated from GPS.
The velocity estimate from GPS is considered more reliable than the latitude/longitude data from GPS. This bug is fixed by commit: 5fe1c71
Therefore we have to check vel_world_based_control_mode
again with the same indoor gain (0.4).
On the other hand, the position control demonstrated reasonably stable flight even in the strong-wind environment. This indicates the feasibility of pos-control based on GPS. Test the waypoint code.
CoG-based state estimation and flight control are complete, and autonomous flight with range sensor + optical flow is working.
The remaining tasks for Hydrus are:
1. Real-machine test of waypoint navigation; verification with whole body aerial manipulation is mandatory.
2. Update of the onboard computer and sensors: intel euclid (side view) + tx1 + downward pointgrey camera.
3. Real-machine test of EKF attitude estimation; a rosbag test is also acceptable.
4. Reduction of the noise of the IMU gyro (angular velocity) used when computing the baselink -> cog velocity component (strengthen the LPF); this also matters for dragon.
As mentioned in this paper,
the angular velocity of the UAV in the {COG}
frame is approximated as equal to the time derivative of the Euler angles. This is based on the assumption that the {COG}
attitude stays close to the level state.
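For reference, a minimal sketch of this approximation using standard ZYX Euler-angle kinematics (my own restatement, not a formula taken from the paper):
\[
\begin{bmatrix} \dot{\phi} \\ \dot{\theta} \\ \dot{\psi} \end{bmatrix}
=
\begin{bmatrix}
1 & \sin\phi\tan\theta & \cos\phi\tan\theta \\
0 & \cos\phi & -\sin\phi \\
0 & \sin\phi/\cos\theta & \cos\phi/\cos\theta
\end{bmatrix}
\begin{bmatrix} p \\ q \\ r \end{bmatrix}
\approx
\begin{bmatrix} p \\ q \\ r \end{bmatrix}
\quad \text{for } \phi, \theta \approx 0,
\]
where (p, q, r) is the body angular velocity in the {COG} frame; near the level state the mapping matrix reduces to the identity, which is exactly the assumption above.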
During the experiments with the quad type, several protector rings and propellers were broken.
They need to be replaced.
Do this together with item 2 of #76.
Adding velocity control in the z axis is needed for the hebi motion planning.
Right now, the offset is calculated by averaging the sensor value over a certain duration in the initial phase:
https://github.com/EESC-LabRoM/labrom_optical_flow/blob/master/src/lucas_kanade.cpp
But this method is not suitable when the sensor starts too close to the ground, where the value is unreliable,
so we should allow setting the offset as a constant parameter, as in the sketch below.
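A minimal sketch of that fallback, assuming hypothetical parameter names (use_constant_offset, constant_offset) that are not in the repository:

#include <ros/ros.h>
#include <numeric>
#include <vector>

// Hypothetical sketch: fall back to a constant, rosparam-supplied offset
// instead of averaging the first samples, which is unreliable when the
// sensor starts too close to the ground.
double estimateOffset(ros::NodeHandle& nh, const std::vector<double>& init_samples)
{
  bool use_constant;
  double constant_offset;
  nh.param("use_constant_offset", use_constant, false);
  nh.param("constant_offset", constant_offset, 0.0);

  if (use_constant || init_samples.empty())
    return constant_offset;  // ground-proximity case: trust the parameter

  // default behaviour: average the samples collected in the initial phase
  return std::accumulate(init_samples.begin(), init_samples.end(), 0.0)
         / init_samples.size();
}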
The kinematics change with both the desired coordinate and the joint states.
The best way is to update the kinematics on each input individually, but the current code only updates on joint states.
Right now this is OK, since the update rate of the LQI is slower than those of the joint states and the desired coordinate
(LQI < joint states = desired coordinate), but it is a potential problem; see the sketch below.
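A sketch of the individual update (updateKinematics and the callback wiring are hypothetical, not the actual code):

#include <ros/ros.h>
#include <sensor_msgs/JointState.h>
#include <geometry_msgs/Quaternion.h>

// Hypothetical sketch: recompute the kinematics from *both* callbacks, so a
// desired-coordinate change is reflected even between joint-state updates.
sensor_msgs::JointState joint_state;
geometry_msgs::Quaternion desire_coordinate;

void updateKinematics(const sensor_msgs::JointState& js,
                      const geometry_msgs::Quaternion& coord)
{
  /* recompute rotor origins, CoG frame, etc. from js and coord */
}

void jointStateCallback(const sensor_msgs::JointState::ConstPtr& msg)
{
  joint_state = *msg;
  updateKinematics(joint_state, desire_coordinate);  // existing behaviour
}

void desireCoordinateCallback(const geometry_msgs::Quaternion::ConstPtr& msg)
{
  desire_coordinate = *msg;
  updateKinematics(joint_state, desire_coordinate);  // currently missing
}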
The yaw D term is handled on the d_board side,
and its upper limit is set quite arbitrarily.
https://github.com/tongtybj/d_board/pull/21
It is currently equivalent to 3 [N]; the sudden drop during transformation in gazebo happens because this D term hits that upper limit. As a first remedy for this problem, slowing down the transformation is enough.
We have been using the first way, which may cause the sudden drop during the aerial transformation; the second one, however, may cause an x/y position offset during the transformation.
It is a trade-off.
GPS:
test: acc without bias estimation
test form: o, s (non yaw)
test control mode: vel (or pos)
Optical Flow:
test: kalman filter
test form: same
test control mode: vel
VO:
test: kalman filter
test form: same
test control mode: pos
aerial_robot_comm is based on the indigo version of mavlink, so right now we pin the mavlink version by adding the repository link in .rosinstall (#20). In the future, we have to update the code in aerial_robot_comm to follow the kinetic version of mavlink.
Problem:
optical flow and GPS take time to process,
so their updates arrive later than the IMU-based predict process. This may be degrading the fusion.
Solution:
ethzasl_sensor_fusion uses a more accurate model in addition to the timestamp synchronization (#49)
The leg parts and similar look wrong. Are the paired parts displayed on the same side instead of at symmetric positions?
It does not look very good, so I want to fix it at some point.
@KahnShi @chibi314
I found a way to add Travis to the private repository,
so I will make the repo private again in a few days.
Right now, a sensor heartbeat loss directly triggers the force landing mode.
However, a better way is to first switch to a lower-level flight mode such as attitude control and then shift to the landing mode; see the sketch below.
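A minimal sketch of the staged fallback (the mode constants and the handler are assumptions, not the actual API):

// Hypothetical sketch: degrade one level per heartbeat timeout instead of
// jumping straight to force landing.
enum FlightMode { POS_CONTROL, VEL_CONTROL, ATT_CONTROL, LANDING, FORCE_LANDING };

void onHeartbeatTimeout(FlightMode& mode)
{
  switch (mode)
    {
    case POS_CONTROL:
    case VEL_CONTROL:
      mode = ATT_CONTROL;   // first drop to pure attitude control
      break;
    case ATT_CONTROL:
      mode = LANDING;       // then shift to a normal landing
      break;
    default:
      mode = FORCE_LANDING; // last resort, as in the current implementation
      break;
    }
}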
@KahnShi
Please check the following code:
https://github.com/tongtybj/aerial_robot/blob/devel/robots/hydrus/src/transform_control.cpp#L523-L633
When you want to make a hydrus_hex model with a camera, you would create hydrus_hex_with_camera.urdf.xacro. But then, when loading it in aerial_robot_model.launch, just setting type:=hydrus_hex_with_camera is not enough, because corresponding files such as $(arg type)_init.yaml must also exist, so it cannot be loaded.
Recreating every such file for this purpose is bad (it produces many files with identical content).
@tongtybj
With the move to dynamixel 2.0, the 8-bit load has become a (signed) 16-bit current, so please update accordingly.
This part seems to be used in aerial_robot/robots/hydrus/src/dynamicxel_bridge.cpp, so please handle it there.
Whether to name it current or load is up to you.
Also, since Halo's update to dynamixel 2.0 may not be ready in time, please put the change in a single commit so that reverting one commit is enough.
The behavior of the hydrusx quad in the gazebo simulation is weird:
check the reason.
In the last experiment, several parameters were changed; they are now pushed to
https://github.com/tongtybj/aerial_robot/tree/hydrus-octo
The optical flow part was implemented with opencv 2.4.x, so the following errors appear with opencv3.
We should check the opencv version in the build.
$ catkin bt
==> Expanding alias 'bt' from 'catkin bt' to 'catkin b --this'
==> Expanding alias 'b' from 'catkin b --this' to 'catkin build --this'
--------------------------------------------------------------------------------------------------------------------------------------------------
Profile: default
Extending: [cached] /home/chou/ros/kinetic/devel:/home/chou/ros/uav_forest_ws/devel:/home/chou/ros/aerial_robot_ws/devel:/opt/ros/kinetic
Workspace: /home/chou/ros/aerial_robot_ws
--------------------------------------------------------------------------------------------------------------------------------------------------
Source Space: [exists] /home/chou/ros/aerial_robot_ws/src
Log Space: [exists] /home/chou/ros/aerial_robot_ws/logs
Build Space: [exists] /home/chou/ros/aerial_robot_ws/build
Devel Space: [exists] /home/chou/ros/aerial_robot_ws/devel
Install Space: [unused] /home/chou/ros/aerial_robot_ws/install
DESTDIR: [unused] None
--------------------------------------------------------------------------------------------------------------------------------------------------
Devel Space Layout: linked
Install Space Layout: None
--------------------------------------------------------------------------------------------------------------------------------------------------
Additional CMake Args: None
Additional Make Args: None
Additional catkin Make Args: None
Internal Make Job Server: True
Cache Job Environments: False
--------------------------------------------------------------------------------------------------------------------------------------------------
Whitelisted Packages: None
Blacklisted Packages: None
--------------------------------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------------------------------
WARNING: Your current environment's CMAKE_PREFIX_PATH is different from the cached CMAKE_PREFIX_PATH used the last time this workspace was built.
If you want to use a different CMAKE_PREFIX_PATH you should call `catkin clean` to remove all references to the previous CMAKE_PREFIX_PATH.
Cached CMAKE_PREFIX_PATH:
/home/chou/ros/kinetic/devel:/home/chou/ros/uav_forest_ws/devel:/home/chou/ros/aerial_robot_ws/devel:/opt/ros/kinetic
Current CMAKE_PREFIX_PATH:
/home/chou/ros/test_ws/devel:/home/chou/ros/kinetic/devel:/home/chou/ros/uav_forest_ws/devel:/home/chou/ros/aerial_robot_ws/devel:/opt/ros/kinetic
--------------------------------------------------------------------------------------------------------------------------------------------------
[build] Found '37' packages in 0.0 seconds.
[build] Updating package table.
Starting >>> aerial_robot_estimation
___________________________________________________________________________________________________________________________________________________________________________________________________________
Warnings << aerial_robot_estimation:cmake /home/chou/ros/aerial_robot_ws/logs/aerial_robot_estimation/build.cmake.000.log
CMake Warning at /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/CMakeLists.txt:16 (message):
OPENCV 3.2.0 FOUND
cd /home/chou/ros/aerial_robot_ws/build/aerial_robot_estimation; catkin build --get-env aerial_robot_estimation | catkin env -si /usr/bin/cmake /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation --no-warn-unused-cli -DCATKIN_DEVEL_PREFIX=/home/chou/ros/aerial_robot_ws/devel/.private/aerial_robot_estimation -DCMAKE_INSTALL_PREFIX=/home/chou/ros/aerial_robot_ws/install; cd -
...........................................................................................................................................................................................................
___________________________________________________________________________________________________________________________________________________________________________________________________________
Errors << aerial_robot_estimation:make /home/chou/ros/aerial_robot_ws/logs/aerial_robot_estimation/build.make.000.log
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:432:28: error: ‘vector’ does not name a type
CV_EXPORTS void merge(const vector<GpuMat>& src, GpuMat& dst, Stream& stream = Stream::Null());
^
/usr/include/opencv2/gpu/gpu.hpp:432:34: error: expected ‘,’ or ‘...’ before ‘<’ token
CV_EXPORTS void merge(const vector<GpuMat>& src, GpuMat& dst, Stream& stream = Stream::Null());
^
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:438:41: error: ‘vector’ has not been declared
CV_EXPORTS void split(const GpuMat& src, vector<GpuMat>& dst, Stream& stream = Stream::Null());
^
/usr/include/opencv2/gpu/gpu.hpp:438:47: error: expected ‘,’ or ‘...’ before ‘<’ token
CV_EXPORTS void split(const GpuMat& src, vector<GpuMat>& dst, Stream& stream = Stream::Null());
^
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:1265:4: error: ‘vector’ does not name a type
vector<Point> locations;
^
/usr/include/opencv2/gpu/gpu.hpp:1266:4: error: ‘vector’ does not name a type
vector<double> confidences;
^
/usr/include/opencv2/gpu/gpu.hpp:1267:4: error: ‘vector’ does not name a type
vector<double> part_scores[4];
^
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:1285:31: error: ‘vector’ does not name a type
void setSVMDetector(const vector<float>& detector);
^
/usr/include/opencv2/gpu/gpu.hpp:1285:37: error: expected ‘,’ or ‘...’ before ‘<’ token
void setSVMDetector(const vector<float>& detector);
^
/usr/include/opencv2/gpu/gpu.hpp:1287:12: error: ‘vector’ does not name a type
static vector<float> getDefaultPeopleDetector();
^
/usr/include/opencv2/gpu/gpu.hpp:1288:12: error: ‘vector’ does not name a type
static vector<float> getPeopleDetector48x96();
^
/usr/include/opencv2/gpu/gpu.hpp:1289:12: error: ‘vector’ does not name a type
static vector<float> getPeopleDetector64x128();
^
/usr/include/opencv2/gpu/gpu.hpp:1291:36: error: ‘vector’ has not been declared
void detect(const GpuMat& img, vector<Point>& found_locations,
^
/usr/include/opencv2/gpu/gpu.hpp:1291:42: error: expected ‘,’ or ‘...’ before ‘<’ token
void detect(const GpuMat& img, vector<Point>& found_locations,
^
/usr/include/opencv2/gpu/gpu.hpp:1295:46: error: ‘vector’ has not been declared
void detectMultiScale(const GpuMat& img, vector<Rect>& found_locations,
^
/usr/include/opencv2/gpu/gpu.hpp:1295:52: error: expected ‘,’ or ‘...’ before ‘<’ token
void detectMultiScale(const GpuMat& img, vector<Rect>& found_locations,
^
/usr/include/opencv2/gpu/gpu.hpp:1300:47: error: ‘vector’ has not been declared
void computeConfidence(const GpuMat& img, vector<Point>& hits, double hit_threshold,
^
/usr/include/opencv2/gpu/gpu.hpp:1300:53: error: expected ‘,’ or ‘...’ before ‘<’ token
void computeConfidence(const GpuMat& img, vector<Point>& hits, double hit_threshold,
^
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:1303:57: error: ‘vector’ has not been declared
void computeConfidenceMultiScale(const GpuMat& img, vector<Rect>& found_locations,
^
/usr/include/opencv2/gpu/gpu.hpp:1303:63: error: expected ‘,’ or ‘...’ before ‘<’ token
void computeConfidenceMultiScale(const GpuMat& img, vector<Rect>& found_locations,
^
In file included from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/include/aerial_robot_estimation/optical_flow.h:61:0,
from /home/chou/ros/aerial_robot_ws/src/aerial_robot/aerial_robot_estimation/src/optical_flow.cpp:1:
/usr/include/opencv2/gpu/gpu.hpp:1835:5: error: ‘vector’ does not name a type
vector<GpuMat> prevPyr_;
^
/usr/include/opencv2/gpu/gpu.hpp:1836:5: error: ‘vector’ does not name a type
vector<GpuMat> nextPyr_;
^
/usr/include/opencv2/gpu/gpu.hpp:1838:5: error: ‘vector’ does not name a type
vector<GpuMat> buf_;
^
/usr/include/opencv2/gpu/gpu.hpp:1839:5: error: ‘vector’ does not name a type
vector<GpuMat> unused;
^
make[2]: *** [CMakeFiles/optical_flow.dir/src/optical_flow.cpp.o] エラー 1
make[1]: *** [CMakeFiles/optical_flow.dir/all] エラー 2
make: *** [all] エラー 2
cd /home/chou/ros/aerial_robot_ws/build/aerial_robot_estimation; catkin build --get-env aerial_robot_estimation | catkin env -si /usr/bin/make --jobserver-fds=6,7 -j; cd -
...........................................................................................................................................................................................................
Failed << aerial_robot_estimation:make [ Exited with code 2 ]
Failed <<< aerial_robot_estimation [ 3.9 seconds ]
[build] Summary: 0 of 1 packages succeeded.
[build] Ignored: 36 packages were skipped or are blacklisted.
[build] Warnings: 1 packages succeeded with warnings.
[build] Abandoned: None.
[build] Failed: 1 packages failed.
[build] Runtime: 4.1 seconds total.
[build] Note: Workspace packages have changed, please re-source setup files to use them.
The stl/dae file is old; please update it.
The two pieces of code are intricately entangled with each other; it is chaotic.
I will clean up those two codes after merging the outdoor
and simulation
branches.
Right now the P term becomes very big in the landing phase if the UAV hovers at a relatively high altitude (e.g. 3 m).
https://github.com/tongtybj/aerial_robot/blob/devel/aerial_robot_base/src/flight_control.cpp#L547
Solution 1:
set a threshold for d_err_pos_curr_throttle_
Solution 2:
set an incremental descent target for the landing phase (the takeoff phase may need this too); see the sketch below.
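A sketch of both solutions; d_err_pos_curr_throttle_ follows the issue, while err_thresh and landing_step are made-up parameters:

#include <algorithm>

// Solution 1 (hypothetical threshold): clamp the position error fed to the
// P term, so hovering at 3 m does not produce a huge throttle command.
double throttlePosErr(double target_pos_z, double estimated_pos_z, double err_thresh)
{
  double d_err_pos_curr_throttle_ = target_pos_z - estimated_pos_z;
  return std::max(std::min(d_err_pos_curr_throttle_, err_thresh), -err_thresh);
}

// Solution 2 (hypothetical step): ramp the target down incrementally, so the
// error itself stays small during the landing (and possibly takeoff) phase.
double landingTarget(double target_pos_z, double landing_step, double dt)
{
  return std::max(target_pos_z - landing_step * dt, 0.0);
}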
I made the compilation procedure as simple and automatic as possible.
I also introduced Travis.
Please check the project page:
https://github.com/tongtybj/aerial_robot
This makes automatic build checks possible, but
the problem is that Travis apparently only supports public repositories;
for private ones it is paid.
@KahnShi
root_link
is the link to which the main flight controller and IMU board are attached. Therefore, the root link is usually the middle link on the real machine. The root link plays an important role in both attitude estimation and flight control.
I thought there was a bug in my code, since the UAV cannot fly if we set a root link different from the default setting.
In fact, there is no bug. The point is that the root link is counted from 1,
not 0,
following the description rule of URDF.
Two parameters should be modified when you want to change the root link:
Hydrus.yaml: the config file for the transform control. Note that this param is also important for the real machine. This parameter can be considered the control part.
SimulationControl.yaml: the config file for the gazebo parameters. This param is responsible for the location of the virtual IMU board in the gazebo env.
The two parameters should be identical, e.g. if we set the first link as the root link,
please set 1
for both parameters; a consistency-check sketch follows.
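A minimal consistency-check sketch; the parameter keys here are hypothetical, the real ones live in Hydrus.yaml and SimulationControl.yaml:

#include <ros/ros.h>

// Hypothetical sketch: read the root link index (counted from 1, per the
// URDF rule above) from both configs and refuse to start on a mismatch.
bool checkRootLink(ros::NodeHandle& nh)
{
  int control_root_link, sim_root_link;
  nh.param("hydrus/root_link", control_root_link, 1);          // Hydrus.yaml side
  nh.param("simulation_control/root_link", sim_root_link, 1);  // SimulationControl.yaml side
  if (control_root_link != sim_root_link)
    {
      ROS_ERROR("root link mismatch: %d vs %d", control_root_link, sim_root_link);
      return false;
    }
  return true;
}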
@tongtybj
For now, I made an 8-link Hydrus (open chain first; making it a closed chain will be considered later) and ran
roslaunch hydrus_transform_control hydrusx_bringup.launch simulation:=True headless:=False model:=octa_closed
and the transform controller crashes.
With printf debugging, I found that it apparently dies inside hamiltonMatrixSolver.
Right now, local and global are completely separated, and this separation also affects the control.
Ideally, local and global should be estimated simultaneously.
Perform the diagonalization via the pseudo-inverse
used in ETH's omni-directional work,
and compare it with optimal control.
Concern: the tuning of each feedback gain is too qualitative. When the input values are converted into thrust, they may hit the thrust limit. We will not know until we try.
Time synchronizable estimation model:
can give a better result for a measurement sensor value which has delay (e.g. Optical Flow, GPS).
Please check bd4da61
XY Roll/Pitch Bias EKF model:
40455a adds a new EKF model which estimates [pos_x, vel_x, pos_y, vel_y, roll_bias, pitch_bias] at the same time. Combine with
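My reading of that state vector as a linearized process model (small roll/pitch; the signs depend on the frame convention, so treat this as a sketch rather than the commit's exact equations):
\[
x = [p_x,\ v_x,\ p_y,\ v_y,\ b_\phi,\ b_\theta]^T, \qquad
\dot{p}_x = v_x, \quad \dot{v}_x = a^{imu}_x - g\, b_\theta, \quad
\dot{p}_y = v_y, \quad \dot{v}_y = a^{imu}_y + g\, b_\phi, \quad
\dot{b}_\phi = \dot{b}_\theta = 0.
\]
A roll/pitch bias leaks a gravity component into the horizontal acceleration, which is why estimating the biases together with position and velocity improves the result.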
@KahnShi @chibi314
The basic knowledge for the gazebo ros interface:
ros_control in gazebo:
basic tutorial and introduction for gazebo control.
gazebo_ros_control_plugin:
the hardware interface in the wide sense. It parses the transmission elements, creates the instances of the "hw interface sim" and the "controller_manager_", and executes "readSim", "writeSim" and "controller_manager_->update".
default_robot_hw_sim:
the hardware interface in the narrow sense. It creates the instance of the specific hardware interface (i.e. joint, motor twist in UAV) and defines the functions "readSim" and "writeSim".
hector_quadcopter:
the code for the rigid UAV in gazebo. The core code should be here.
Transmission:
connects the specific hardware_interface (joint, motor twist in UAV) and the actuator_interface; check the example.
The connection between controller and HW:
the controller is called from the controller_spawner, while the HW is called from gazebo_ros_control_plugin. The connection between controller and HW is based on pointers(?). Please check the hector code,
and also check this code. A simplified sketch of the handle registration follows.
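A simplified sketch using the standard hardware_interface calls, condensed from the default_robot_hw_sim pattern (the joint name and the class itself are made up):

#include <hardware_interface/joint_command_interface.h>
#include <hardware_interface/joint_state_interface.h>
#include <hardware_interface/robot_hw.h>

// The controller later obtains a JointHandle (essentially pointers to these
// doubles) through the controller_manager: this is the pointer-based
// connection between controller and HW mentioned above.
class SimpleRobotHWSim : public hardware_interface::RobotHW
{
public:
  SimpleRobotHWSim()
  {
    hardware_interface::JointStateHandle state_handle("joint1", &pos_, &vel_, &eff_);
    js_interface_.registerHandle(state_handle);

    hardware_interface::JointHandle cmd_handle(js_interface_.getHandle("joint1"), &cmd_);
    ej_interface_.registerHandle(cmd_handle);

    registerInterface(&js_interface_);
    registerInterface(&ej_interface_);
  }
  // readSim()/writeSim() would copy pos_/vel_/eff_ from gazebo and cmd_ back.
private:
  hardware_interface::JointStateInterface js_interface_;
  hardware_interface::EffortJointInterface ej_interface_;
  double pos_{0}, vel_{0}, eff_{0}, cmd_{0};
};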
The state stores the 6-DOF pose up to its second derivative, described in both the Baselink (FCU/IMU attached) frame and the CoG frame. It is further divided into three modes: egomotion, experiment, and ground_truth. See basic_state_estimation.h for the details.
Basically, the state estimation is performed in the Baselink frame, and from its result and the kinematics the CoG-frame state is obtained. Flight control is performed in the CoG frame.
Tasks:
Move this to aerial_robot_estimation.
The current state estimator is ros-based, so real-time performance is not guaranteed. In addition, since multiple processes access it, mutexes are used, but the exclusion handling is also sloppy. We should think of a better way.
Right now we assume the yaw velocity is 0 and ignore this nonlinear term,
but it could be canceled in a feedforward manner.
The current approach is based on pos_vel_acc_bias, but it is theoretically not very rigorous.
A better model is provided in ethzasl_sensor_fusion; we should properly use that one.
After switching the robot to attitude control, we cannot switch back into position or velocity mode.
The motor number for the general PID control in the flight control is set to 1.
For example:
https://github.com/tongtybj/aerial_robot/blob/devel/aerial_robot_base/src/flight_control.cpp#L522-L524
We need to set the correct motor number; see the sketch below.
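A one-function sketch of why the hard-coded 1 is wrong (perMotorForce and motor_num are illustrative names):

// The total collective force from the PID loop must be divided across all
// rotors; with motor_num fixed to 1, every motor receives the full force.
double perMotorForce(double total_throttle_force, int motor_num)
{
  return total_throttle_force / motor_num;  // motor_num must match the robot
}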
Add a global or head attitude control (navigation by joy); currently we only have the CoG attitude control.
Right now we assume a static transformation, so that
the FCU_frame yaw and position equal the CoG_frame yaw and position,
but this is not rigorous.
In particular, during an o -> l transformation in simulation or on the real machine, the yaw deviates largely at the end.
The problem is probably here, but
fundamentally, 1. there is no need to align the COG_frame orientation with the principal axes of inertia in the first place, and
2. the nonlinear gyroscopic term is not compensated,
which may be why this problem occurs.
The so-called feedforward part in the LQI control method is essentially the same as the PID feedback control scheme.
https://github.com/tongtybj/aerial_robot/blob/devel/aerial_robot_base/src/flight_control.cpp#L592-L617
There is no need to separate the target value part; merge it into the general feedback part, as restated below.
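A one-line restatement of the claim (my notation): with the LQI gain K, state x, and reference r, the separate target-value (feedforward) term is algebraically the same as feeding the error through the gain,
\[
u = -K x + K r = -K\,(x - r),
\]
so the target-value branch can be merged into the ordinary error-feedback path.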
The impact on the servo when crashing into the ground is very large, and it may break the motor inside.
So we have to turn off the servo and shift to the passive mode as quickly as possible in this emergency case.
Watch the sensor feedback from the IMU and the altitude velocity, as well as the force landing flag; a sketch follows.
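A sketch of the proposed trigger (the thresholds and setServoTorque are hypothetical):

// Hypothetical sketch: detect a crash from the IMU acceleration and the
// altitude velocity together with the force-landing flag, then cut servo
// torque so the joints shift to the passive mode.
void setServoTorque(bool on);  // hypothetical: sends torque enable/disable

void checkCrash(double acc_norm, double vel_z, bool force_landing_flag)
{
  const double acc_thresh = 30.0;  // [m/s^2], made-up value
  const double vel_thresh = -2.0;  // [m/s], made-up value

  if (force_landing_flag && (acc_norm > acc_thresh || vel_z < vel_thresh))
    setServoTorque(false);
}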