Welcome to the documentation Wiki of Bambi

An introduction still needs to be written. Some links to YouTube videos of similar projects:

BAMBI INSTALLATION REQUIREMENTS

  1. Install script used for the installation (this also installs Gazebo, which is not needed on the Raspberry Pi): [https://dev.px4.io/en/setup/dev_env_linux.html#gazebo-with-ros]
  2. KML parser (a pykml usage sketch follows this list):
    1. sudo apt install libxml2-dev libxslt-dev python-dev
    2. sudo pip install lxml
    3. sudo pip install pykml
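
A minimal pykml sketch of what the parser can then do (the file name field.kml and its Placemark/Polygon layout are assumptions for illustration, not part of the install steps):

  # Read a KML file and print each placemark's name and polygon boundary.
  # Assumes "field.kml" contains a Document with Placemark elements.
  from pykml import parser

  with open('field.kml') as f:
      root = parser.parse(f).getroot()

  # pykml exposes KML elements as lxml.objectify attributes.
  for placemark in root.Document.Placemark:
      print(placemark.name.text)
      # A Polygon boundary is a whitespace-separated "lon,lat,alt" list.
      coords = placemark.Polygon.outerBoundaryIs.LinearRing.coordinates
      print(coords.text.strip())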

Project Bias Point

The project aims to develop an automated solution for a very specific use case. The already developed and thoroughly tested concept of the UAV is the project's starting point, so we will not develop new techniques for these basic issues. Therefore we start with a drone that is able to:

  1. Hold a desired altitude
  2. Keep / fly to a desired GPS position
  3. Compensate for external disturbances
  4. Provide ground communication for troubleshooting and development


To achieve this we use the following basic hardware:

  • Pixhawk flashed with the PX4 flight stack (a full-featured open-source flight controller)
  • NEO-M8N (GPS)
  • 3DR telemetry radio, 433 MHz (provides a serial link to the ground station and communicates with the flight board through the MAVLink protocol; see the sketch after this list)
  • Video TX TS5823/TS5828 (300/600 mW video transmitter, 5.8 GHz)
  • Video RX, 5.8 GHz, with AV output or USB OTG (UVC)
  • OBSOLETE Sonar (WT81B003-0202 (Ultrasonic long distance sensor))
  • LidarLite V3 (altitude distance sensor) (Datasheet)
  • TFmini lidar (Datasheet)
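
As a sketch of the MAVLink ground link mentioned above, the telemetry radio can be read with pymavlink (/dev/ttyUSB0 and 57600 baud are assumptions for a typical 3DR radio; adjust to your setup):

  # Listen to the Pixhawk over the 3DR telemetry radio via MAVLink.
  from pymavlink import mavutil

  link = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)
  link.wait_heartbeat()  # block until the flight controller is heard
  print('Heartbeat from system %d' % link.target_system)

  # Print relative altitude from GLOBAL_POSITION_INT (millimetres to metres).
  while True:
      msg = link.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
      print('altitude: %.2f m' % (msg.relative_alt / 1000.0))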

Mimmo

Meetings:

  1. 2017-07-04 Skype Call
  2. 2017-11-15 Meeting 1
  3. 2018-07-0x Meeting 2

Simulation

  1. Simulation with Gazebo + PX4 SITL + ROS (a MAVLink connection sketch follows below)
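
The same MAVLink tooling works against the simulator: by default, a local PX4 SITL instance publishes MAVLink for offboard APIs on UDP port 14540 (a minimal sketch, assuming SITL is already running on the same machine):

  # Connect to a locally running PX4 SITL instance over UDP.
  # udp:14540 is PX4's default offboard MAVLink port; adjust if remapped.
  from pymavlink import mavutil

  sitl = mavutil.mavlink_connection('udp:127.0.0.1:14540')
  sitl.wait_heartbeat()
  print('Connected to SITL: system %d, component %d'
        % (sitl.target_system, sitl.target_component))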

Project Milestones

  1. Georeferenced Orthophoto
  2. Image Processing (obtain the contour of the field; a sketch follows this list)
  3. Automated Flight Route Planner
  4. Obstacle Avoidance Flight Mode
  5. Thermal Camera Processing
  6. Bambi Saving Workflow (e.g. mobile apps)
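
For milestone 2, one plausible starting point is a plain threshold-and-contour pass with OpenCV (a hedged sketch, not the project's actual pipeline; orthophoto.png and the assumption that the field is the largest high-contrast region are for illustration only):

  # Extract a candidate field outline from an orthophoto (OpenCV 4.x API).
  import cv2

  img = cv2.imread('orthophoto.png')  # assumed input file
  gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
  blur = cv2.GaussianBlur(gray, (5, 5), 0)

  # Otsu's threshold separates field from background when contrast allows.
  _, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

  contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                 cv2.CHAIN_APPROX_SIMPLE)
  field = max(contours, key=cv2.contourArea)  # take the largest blob
  print('field contour has %d points' % len(field))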

Project LOG

Intensive project activity log (see Activity Log):

  1. Day 1
  2. Day 2
  3. Day 3
  4. Day 4
  5. Day 5
  6. Day 6
  7. Day 7
  8. Day 8
  9. Day 9
  10. Day 10
  11. Day 11
  12. Day 12
  13. Day 13
  14. Day 14
  15. For the last 3 days, see Activity Log

See also Longterm TODOs

Collection of related stuff that may be useful

* [https://lirias.kuleuven.be/bitstream/123456789/490846/1/VISAPP_15_DHU.pdf How to choose the best embedded processing platform]
* [https://dashcamtalk.com/forum/threads/xiaomi-yi-camera-gui-control-configure-from-pc-win-lin-mac.11206 GUI and Python script to control the Yi Cam] (e.g. shoot trigger)
** See also [https://gist.github.com/SkewPL/f57e6cff7fa14601f6b256926aa33437 https://gist.github.com/SkewPL/f57e6cff7fa14601f6b256926aa33437]
* [https://larrylisky.com/2016/11/24/enabling-raspberry-pi-camera-v2-under-ubuntu-mate/ How to configure the Pi camera on Ubuntu]
* [https://link.springer.com/chapter/10.1007/978-3-319-50835-1_49 ETHZ: Detection and tracking of possible human victims using thermal and visual cameras in real time]
