@Jeremy-Schmidt Hi,
It's resolved now; in the meantime I received a response from support.
Have a good day
Julien
Posts made by Judoor 0
-
RE: Motor stops functioning on Starling 2
@Alex-Kushleyev I sent an email to the ModalAI support for the RMA process, and after a first response telling me they would check it, I still have no news from them and it has been 3 weeks. Do you know another way to get in touch with support? My Starling has been stuck on the ground since then and I can't test anything.
Thanks
Julien -
RE: apt upgrade error
@Judoor-0 after that I did an apt autoremove and it deleted almost everything on my drone... is it possible to recover from that?
Here is my voxl-inspect-services output. Commands like voxl-configure-mpa are not available anymore.
If I flash a new SDK on the Starling, does it delete everything in /home/root?
Julien
-
apt upgrade error
Hi,
I'm trying to upgrade a Starling 2, but when I do apt update && apt upgrade
I'm left with this error and I don't know how to resolve it:
```
Preparing to unpack .../wpasupplicant_2%3a2.6-15ubuntu2.8+esm1_arm64.deb ...
Unpacking wpasupplicant (2:2.6-15ubuntu2.8+esm1) over (2:2.6-15ubuntu2.8) ...
dpkg: error processing archive /var/cache/apt/archives/wpasupplicant_2%3a2.6-15ubuntu2.8+esm1_arm64.deb (--unpack):
 trying to overwrite '/lib/systemd/system/wpa_supplicant.service', which is also in package voxl2-wlan 1.0-r0
dpkg-deb: error: paste subprocess was killed by signal (Broken pipe)
Failed to add /run/systemd/ask-password to directory watch: No space left on device
Errors were encountered while processing:
 /var/cache/apt/archives/wpasupplicant_2%3a2.6-15ubuntu2.8+esm1_arm64.deb
E: Sub-process /usr/bin/dpkg returned an error code (1)
```
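For reference, two separate problems appear in this log: the disk is full ("No space left on device"), and the wpasupplicant update tries to overwrite a file owned by the VOXL package voxl2-wlan. A generic Debian/Ubuntu workaround (a sketch, not ModalAI-specific advice; the .deb path is copied from the log above) would be:

```shell
# 1. Free disk space first -- the log also reports "No space left on device".
df -h /            # check how full the root filesystem is
sudo apt clean     # empty /var/cache/apt/archives

# 2a. Safest: keep the distro package from replacing the VOXL-provided
#     wpa_supplicant.service by holding wpasupplicant at its current version.
sudo apt-mark hold wpasupplicant

# 2b. Or force the overwrite of the conflicting file, then let apt
#     finish configuring any half-installed packages.
sudo dpkg -i --force-overwrite \
    /var/cache/apt/archives/wpasupplicant_2%3a2.6-15ubuntu2.8+esm1_arm64.deb
sudo apt -f install
```

Holding the package (2a) avoids touching the VOXL-owned service file at all, which is probably the safer option on a drone image.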
Julien
-
RE: RAM building up fast
@tom
Here is a picture of my services :
It happened just the other day, multiple times.
But I've tested the drone again today and the RAM seems fine, never going above 3590MB/7671MB. Maybe one of my programs was running in the background, I don't really know. If it happens again I'll tell you and send you more precise information about what's running.
Thanks for your help.
Julien
-
RAM building up fast
Hello,
I have the Starling 2 development kit, and during long-term testing I've noticed that RAM usage increases very quickly (to the point of crashing the drone), even though the drone is on the bench and I'm only connected via SSH, without running any external programs. I'm using SDK version 1.3.5. Has this happened before, and how can I clear the RAM to prevent the drone from crashing?
Julien
Setup: Starling 2, SDK 1.3.5
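Since the drone is reachable over SSH while the leak happens, a generic Linux check (nothing VOXL-specific) can show whether one process is eating the RAM during a long bench test:

```shell
# Snapshot of overall memory from the kernel (values in kB)
grep -E 'MemTotal|MemAvailable' /proc/meminfo

# The ten processes holding the most resident memory (RSS)
ps aux --sort=-rss | head -n 11

# Take three timestamped samples, 5 s apart, into a temporary log
log=$(mktemp)
for i in 1 2 3; do
    date >> "$log"
    ps aux --sort=-rss | head -n 6 >> "$log"
    sleep 5
done
echo "samples written to $log"
```

Comparing the first and last samples of such a log over an hour should point at whichever service's RSS keeps growing.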
-
RE: Motor stops functioning on Starling 2
@Alex-Kushleyev it was an original screw; I don't know why, but compared to another one it's slightly longer. I'll request an RMA.
Julien
-
VIO navigation with QGC waypoints
Hi,
I want to test the VIO waypoint functionality, but when starting a mission I get the error: no global position.
How can I give a global position to QGC, knowing I only use VIO?
Julien -
RE: VIO waypoint navigation
@Cliff-Wong Hi,
I'm trying to implement VIO navigation without GPS. I'm on SDK 1.3.2 with a Starling 2.
When uploading the waypoints to the drone, the drone's global position isn't referenced to the take-off point; in fact, it doesn't appear on the map at all.
What can I do to fix that?
Julien -
Motor stops functioning on Starling 2
Hi,
I've got a Starling 2 with the black motors, and today during testing one of the motors started to stop spinning randomly. I checked the wires and everything seems fine. I swapped the motor with a functional one on my drone and the same motor still didn't work, so the ESC is good.
So I swapped the defective motor with another one I had in stock, put the props on it, and that motor didn't work either!
Well, the problem came from one of the propeller screws, which is slightly longer than usual and scratches the top of the motor wire, preventing the motor from turning. So, I need a new motor for my Starling.
Julien
-
RE: Install libopencv on Starling 2
@Prabhav-Gupta I didn't use the stereo data because I don't have a stereo camera on my drone; I used the point cloud returned by the ToF. But I know that with RTAB-Map you can easily use stereo without depth. There are some examples on the rtabmap_ros website that may help you.
-
RE: Install libopencv on Starling 2
@wilkinsaf Does mpa_to_ros do the same thing? Because I currently use ROS1, and for ROS1, mpa_to_ros only sends camera images or point clouds, not camera_info.
-
RE: Install libopencv on Starling 2
@wilkinsaf Okay, thanks for your reply. Sorry, that was not clear. By camera data I mean data that I can send to RTAB-Map as a sensor_msgs/CameraInfo message. There is some of that data in the extrinsics files for the tracking camera, but not for the other sensors.
-
Problem with TOF sensor on the new Starling 2
Hi,
I've been developing on the Starling 2 for some time now, but last week my ToF sensor stopped working, and now it's as if it doesn't even exist.
I've tried to recover it by flashing everything back to default multiple times, but it didn't work.
I don't know what's wrong, but I'm stuck and cannot continue my project. Can you help me?
Julien
-
Sensors simulation with Gazebo
Hello !
I would like to know if it's possible to simulate the Starling 2's sensors (ToF, hires, and tracking) in Gazebo. Are there any resources I can use for that?
Any links or help will be welcome.
Thanks !
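For reference, Gazebo Classic can approximate each sensor generically: a `depth` camera for the ToF, and ordinary `camera` sensors for the hires and tracking cameras. A minimal SDF sketch for a ToF-like depth camera follows; the 240x180 resolution, 15 Hz rate, and ~107° FOV are assumptions, not official Starling 2 values, and the topic names reuse the ones from my ROS setup:

```xml
<sensor name="tof" type="depth">
  <update_rate>15</update_rate>
  <camera>
    <horizontal_fov>1.87</horizontal_fov>  <!-- ~107 deg, assumed -->
    <image>
      <width>240</width>
      <height>180</height>
    </image>
    <clip>
      <near>0.1</near>
      <far>6.0</far>
    </clip>
  </camera>
  <!-- ROS1 depth-camera plugin: publishes depth image + point cloud -->
  <plugin name="tof_controller" filename="libgazebo_ros_openni_kinect.so">
    <cameraName>tof</cameraName>
    <imageTopicName>/drone/tof_depth/image_raw</imageTopicName>
    <depthImageTopicName>/drone/tof_depth/image</depthImageTopicName>
    <pointCloudTopicName>/drone/tof_pc</pointCloudTopicName>
    <frameName>tof_link</frameName>
  </plugin>
</sensor>
```

This would go inside a link of the drone's SDF/URDF model; I don't have an official Starling 2 model, so the sensor pose would also have to be estimated.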
-
RE: Install libopencv on Starling 2
@Alex-Kushleyev @wilkinsaf Thanks for your answers! Actually, I've done something similar with Docker, but I still had a few problems with the data format that mpa_to_ros sends. To address this, I did the following:
```python
#!/usr/bin/env python
import rospy
import cv2
import numpy as np
from sensor_msgs.msg import Image, CameraInfo
from cv_bridge import CvBridge, CvBridgeError
import yaml


class CamImageConverter:
    def __init__(self):
        self.bridge = CvBridge()
        # Subscribers
        rospy.Subscriber("/drone/tof_depth/image_raw", Image, self.depth_callback)
        rospy.Subscriber("/drone/hires/image_raw", Image, self.rgb_callback)
        rospy.Subscriber("/drone/tracking_front/image_raw", Image, self.tracking_callback)
        # Publishers
        self.depth_pub = rospy.Publisher("/drone/tof_depth/image", Image, queue_size=1000)
        self.rgb_pub = rospy.Publisher("/drone/hires/image", Image, queue_size=1000)
        self.rgb_info_pub = rospy.Publisher("/drone/hires/camera_info", CameraInfo, queue_size=1000)
        self.tracking_pub = rospy.Publisher("/drone/tracking_front/image", Image, queue_size=1000)
        self.tracking_info_pub = rospy.Publisher("/drone/tracking_front/camera_info", CameraInfo, queue_size=1000)
        self.camera_info_msg = self.load_camera_info("/root/ros_ws/src/cam_converter/calibration/ost.yaml")
        rospy.Timer(rospy.Duration(1.0 / 30), self.publish_tracking_info)
        rospy.Timer(rospy.Duration(1.0 / 30), self.publish_rgb_info)

    def load_camera_info(self, filename):
        with open(filename, "r") as file:
            calib_data = yaml.safe_load(file)
        camera_info_msg = CameraInfo()
        camera_info_msg.height = calib_data["image_height"]
        camera_info_msg.width = calib_data["image_width"]
        camera_info_msg.distortion_model = calib_data["distortion_model"]
        camera_info_msg.D = calib_data["distortion_coefficients"]["data"]
        camera_info_msg.K = calib_data["camera_matrix"]["data"]
        camera_info_msg.R = calib_data["rectification_matrix"]["data"]
        camera_info_msg.P = calib_data["projection_matrix"]["data"]
        return camera_info_msg

    def depth_callback(self, msg):
        try:
            # Convert the ROS message to an OpenCV image
            raw_image = self.bridge.imgmsg_to_cv2(msg, desired_encoding="mono8")
            # Normalize the image to the full 16-bit range
            depth_image = cv2.normalize(raw_image, None, 0, 65535, cv2.NORM_MINMAX).astype(np.uint16)
            # Publish the converted image
            converted_msg = self.bridge.cv2_to_imgmsg(depth_image, encoding="16UC1")
            converted_msg.header.frame_id = 'tof_link'
            converted_msg.header.stamp = rospy.Time.now()
            self.depth_pub.publish(converted_msg)
        except CvBridgeError as e:
            rospy.logerr("CvBridge Error: {0}".format(e))

    def rgb_callback(self, msg):
        try:
            # Convert the ROS message to an OpenCV image
            raw_image = self.bridge.imgmsg_to_cv2(msg, desired_encoding="passthrough")
            # Convert YUV422 or NV12 to RGB8
            if msg.encoding == "yuv422":
                rgb_image = cv2.cvtColor(raw_image, cv2.COLOR_YUV2RGB_Y422)
            elif msg.encoding == "nv12":
                yuv_image = np.frombuffer(msg.data, dtype=np.uint8).reshape((msg.height * 3 // 2, msg.width))
                rgb_image = cv2.cvtColor(yuv_image, cv2.COLOR_YUV2RGB_NV12)
            else:
                rospy.logerr("Unsupported encoding: {}".format(msg.encoding))
                return
            # Resize the RGB image to the size used for the ToF stream (width 180, height 240)
            resized_rgb_image = cv2.resize(rgb_image, (180, 240), interpolation=cv2.INTER_LINEAR)
            # Publish the converted, resized image
            converted_msg = self.bridge.cv2_to_imgmsg(resized_rgb_image, encoding="rgb8")
            converted_msg.header.frame_id = 'hires_link'
            converted_msg.header.stamp = rospy.Time.now()
            self.rgb_pub.publish(converted_msg)
        except CvBridgeError as e:
            rospy.logerr("CvBridge Error: {0}".format(e))

    def tracking_callback(self, msg):
        try:
            # Convert the ROS message to an OpenCV image
            raw_image = self.bridge.imgmsg_to_cv2(msg, desired_encoding="passthrough")
            # Resize to the same size as the ToF stream (width 180, height 240)
            resized_image = cv2.resize(raw_image, (180, 240), interpolation=cv2.INTER_LINEAR)
            # Publish the converted, resized image
            converted_msg = self.bridge.cv2_to_imgmsg(resized_image)
            converted_msg.header.frame_id = 'tracking_front_link'
            converted_msg.header.stamp = rospy.Time.now()
            self.tracking_pub.publish(converted_msg)
        except CvBridgeError as e:
            rospy.logerr("CvBridge Error: {0}".format(e))

    def publish_rgb_info(self, event):
        self.camera_info_msg.header.stamp = rospy.Time.now()
        self.camera_info_msg.header.frame_id = 'hires_link'
        self.rgb_info_pub.publish(self.camera_info_msg)

    def publish_tracking_info(self, event):
        camera_info_msg = CameraInfo()
        camera_info_msg.header.frame_id = 'tracking_front_link'
        camera_info_msg.header.stamp = rospy.Time.now()
        camera_info_msg.height = 240
        camera_info_msg.width = 180
        camera_info_msg.distortion_model = 'plumb_bob'
        camera_info_msg.D = [5.0234834483837899e-02, 3.2967878240805777e-02,
                             -1.6468916528340826e-02, 1.8155367951008903e-03]
        camera_info_msg.K = [4.6177826259848081e+02, 0., 6.1505488982380564e+02,
                             0., 4.6160870283061041e+02, 4.1065345867690684e+02,
                             0., 0., 1.]
        camera_info_msg.R = [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
        camera_info_msg.P = [4.6177826259848081e+02, 0., 6.1505488982380564e+02, 0.,
                             0., 4.6160870283061041e+02, 4.1065345867690684e+02, 0.,
                             0., 0., 1., 0.]
        self.tracking_info_pub.publish(camera_info_msg)


def main():
    rospy.init_node('cam_image_converter', anonymous=True)
    CamImageConverter()
    rospy.spin()


if __name__ == '__main__':
    main()
```
I don't know if this is the right way to do it, but it seems to work. Do you have any suggestions for implementing this directly on the Voxl2?
Also, I have a few problems with camera calibrations and the drone model. Do you have any files that contain all the data for the cameras (ToF, tracking, and high-resolution) and a model of the drone that I can use in RViz and Gazebo? This could really help me.
For more context, I'm trying to implement this project for Autonomous SLAM with ROS. Has anyone already used RTAB-Map on this drone? Do you have any advice to solve this problem?
Thank you for your help
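On the RTAB-Map question: with the remapped topics from the script above, a typical ROS1 launch of rtabmap_ros in RGB-D mode would look roughly like this. This is only a sketch; the arguments (`rgb_topic`, `depth_topic`, `camera_info_topic`, etc.) are the standard `rtabmap.launch` ones from rtabmap_ros, and `base_link` is an assumed frame name:

```xml
<launch>
  <include file="$(find rtabmap_ros)/launch/rtabmap.launch">
    <arg name="rtabmap_args"      value="--delete_db_on_start"/>
    <arg name="rgb_topic"         value="/drone/hires/image"/>
    <arg name="depth_topic"       value="/drone/tof_depth/image"/>
    <arg name="camera_info_topic" value="/drone/hires/camera_info"/>
    <arg name="frame_id"          value="base_link"/>
    <arg name="approx_sync"       value="true"/>
  </include>
</launch>
```

`approx_sync` is needed here because the ToF and hires images are not hardware-synchronized.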
-
Install libopencv on Starling 2
Hello,
I tried to install a ROS package on my Starling 2, but I encountered this error:
I have a few questions about that:
- Is there any way I can use the VOXL libraries to substitute those libraries?
- If I need a Docker:
- Is there a Docker tutorial I can follow to help me build this image? Is there any image available with ROS Noetic for the drone?
- How can I interface Docker with my drone using ROS or my hardware?
Thanks for your answers.
-
RE: Starling 2 doesn't detect camera after flashing new SDK
@tom I redid the SDK install and yes, this time only these two options appeared:
- M0054-1 -> QSM8250, Starling (D0005), FPV (D0008)
- M0054-1 -> QSM8250, Starling 2 Max (D0012), Starling 2 (D0014)
And it worked, thank you!
The weird thing is that I did the exact same SDK install twice yesterday and was never offered only these two options; there were like 6 options available. Is it because I booted the drone directly into fastboot mode?
-
Starling 2 doesn't detect camera after flashing new SDK
Hi there,
I had some problems with libraries that I had installed twice, so I decided to flash a new SDK to fix them. The installation was successful, but I have a problem.
I have the very latest version of the Starling 2, and when I run configure-sku and then configure-mpa, the drone doesn't detect the cameras.
I chose all the parameters exactly like on my other drone (I have two similar drones that I ordered at the same time), and it doesn't work.
Does anyone have a solution to my problem?
Thank you for your answers.