Welcome to the documentation Wiki of Bambi

An introduction still needs to be written. Some links to YouTube videos of similar projects:

* [https://www.youtube.com/watch?v=UJxRuLogFrE&t=214s Rheingau Taunus]
 
== BAMBI INSTALLATION REQUIREMENTS ==
  
# Run the install script from the PX4 development guide; it also installs Gazebo, which is not needed on the Raspberry Pi: [https://dev.px4.io/en/setup/dev_env_linux.html#gazebo-with-ros https://dev.px4.io/en/setup/dev_env_linux.html#gazebo-with-ros]
# <code>sudo apt install libxml2-dev libxslt-dev python-dev python-rosinstall ros-kinetic-geodesy ros-kinetic-control-toolbox</code>
# <code>sudo -H pip install future lxml pykml libgeographic-dev</code> (note: if <code>pip</code> cannot find <code>libgeographic-dev</code>, it is normally the Debian development package for GeographicLib and has to be installed with <code>apt</code> instead)
  
 
== Project Bias Point ==
 
The project aims to develop an automated solution for a very specific use case. The already developed and well-established concept of the UAV is the starting point of the project, so we will not develop new techniques for these basic issues. We therefore start with a drone that is able to:

# Hold a desired altitude
# Keep / fly to a desired GPS position
# Compensate for external disturbances
# Provide ground communication for troubleshooting and development

To achieve this we use the following basic hardware:

* Pixhawk flashed with the PX4 flight stack (a full-featured open-source flight controller)
* NEO-M8N (GPS)
* 3DR telemetry radio 433 MHz (serial link to the ground station, communicating with the flight board through the MAVLink protocol; a minimal connection sketch follows the hardware list)
* Video TX TS5823/TS5828 (300/600 mW 5.8 GHz video transmitter)
* Video RX 5.8 GHz with AV output or OTG (UVC)
* '''OBSOLETE''' Sonar ([[WT81B003-0202 (Ultrasonic long distance sensor)]])
* LidarLite V3 (altitude distance sensor) ([https://static.garmin.com/pumac/LIDAR_Lite_v3_Operation_Manual_and_Technical_Specifications.pdf Datasheet])
* TFmini lidar ([https://cdn.sparkfun.com/assets/5/e/4/7/b/benewake-tfmini-datasheet.pdf Datasheet]; a UART read-out sketch is shown below)
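
A minimal ground-station sketch (not taken from the project code) of what the MAVLink link over the 433 MHz telemetry radio can look like, using the pymavlink library. The serial device name <code>/dev/ttyUSB0</code> and the 57600 baud rate are assumptions based on the usual SiK radio defaults:

<pre>
# Hypothetical ground-station sketch: listen to the Pixhawk/PX4 over the 433 MHz
# telemetry radio via MAVLink (pymavlink). Device name and baud rate are assumptions.
from pymavlink import mavutil

link = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)
link.wait_heartbeat()  # block until the flight controller is heard
print("Heartbeat from system %u component %u" % (link.target_system, link.target_component))

# Read one position report (GLOBAL_POSITION_INT: lat/lon in 1e-7 deg, altitudes in mm)
msg = link.recv_match(type='GLOBAL_POSITION_INT', blocking=True, timeout=10)
if msg is not None:
    print("lat=%.7f lon=%.7f rel_alt=%.2f m" % (msg.lat / 1e7, msg.lon / 1e7, msg.relative_alt / 1000.0))
</pre>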
  
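In the same spirit, a hedged sketch for the TFmini: according to the linked datasheet it streams 9-byte frames over UART at 115200 baud (header <code>0x59 0x59</code>, then distance and signal strength as little-endian 16-bit values, distance in cm). The port name <code>/dev/serial0</code> is an assumption for a Raspberry Pi UART:

<pre>
# Hypothetical sketch: read one distance frame from the TFmini over UART.
# Frame layout per datasheet: 0x59 0x59, Dist_L, Dist_H, Strength_L, Strength_H, ...
import serial
import struct

def read_tfmini(port='/dev/serial0'):
    with serial.Serial(port, 115200, timeout=1) as s:
        while True:
            # sync on the two-byte frame header
            if s.read(1) == b'\x59' and s.read(1) == b'\x59':
                payload = s.read(7)
                if len(payload) == 7:
                    dist_cm, strength = struct.unpack('<HH', payload[:4])
                    return dist_cm, strength

if __name__ == '__main__':
    print("TFmini distance: %d cm" % read_tfmini()[0])
</pre>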
 
== Mimmo ==
 
== Simulation ==

== Project Milestones ==
 
== Project LOG ==
 
Intensive project activity log (see [[Activity Log]]):
  
 
# [[Day 1]]
# [[Day 2]]
# [[Day 3]]
# [[Day 4]]
# [[Day 5]]
# [[Day 6]]
# [[Day 7]]
# [[Day 8]]
# [[Day 9]]
# [[Day 10]]
# [[Day 11]]
# [[Day 12]]
# [[Day 13]]
# [[Day 14]]
# For the last 3 days, see [[Activity Log]]
  
 
See also [[Longterm TODOs]]
 
  
 
== Collection of related stuff that may be useful ==
 
* [https://dev.px4.io/en/ PX4 Development Guide and Architectural Overview]
* [https://lirias.kuleuven.be/bitstream/123456789/490846/1/VISAPP_15_DHU.pdf How to choose the best embedded processing platform]
* [https://dashcamtalk.com/forum/threads/xiaomi-yi-camera-gui-control-configure-from-pc-win-lin-mac.11206 GUI and Python script to control the YI cam] (e.g. shoot trigger)
** See also [https://gist.github.com/SkewPL/f57e6cff7fa14601f6b256926aa33437 https://gist.github.com/SkewPL/f57e6cff7fa14601f6b256926aa33437]
* [https://larrylisky.com/2016/11/24/enabling-raspberry-pi-camera-v2-under-ubuntu-mate/ How to configure the Raspberry Pi camera on Ubuntu]
* [https://link.springer.com/chapter/10.1007/978-3-319-50835-1_49 ETHZ: Detection and tracking of possible human victims using thermal and visual cameras in real time]
