Focus Areas

Cybersecurity

Cybersecurity is the practice of safeguarding computer systems, networks, and data from theft, damage, or unauthorized access. As robots have become increasingly integrated into sectors such as healthcare, manufacturing, transportation, and domestic applications, cybersecurity for these machines has become paramount. Robots, often embedded with sensors, actuators, and complex control algorithms, rely heavily on data communication and processing. Any breach in their systems could compromise sensitive data or, more concerning still, hand malicious control of the robot itself to an attacker.

Unmanned Ground Vehicles (UGVs) are integral to military, industrial, and civilian sectors, necessitating stringent cybersecurity. Because UGVs operate autonomously in diverse environments, protecting them against cyber threats is vital: a compromised UGV can lead to espionage, data breaches, or even physical damage. Consequently, cybersecurity for UGVs emphasizes secure communication, encrypted commands, and intrusion detection, with regular firmware updates further bolstering their defenses. Ensuring UGVs’ digital integrity is crucial for both mission success and environmental safety.
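As a minimal illustration of one of these defenses, authenticated commands can be implemented with a keyed message authentication code so the UGV rejects forged or replayed instructions. The sketch below uses Python's standard hmac module; the key, device names, and message layout are hypothetical, not the protocol actually deployed on our platforms.

```python
import hashlib
import hmac
import json
import time

SECRET_KEY = b"shared-ugv-secret"  # hypothetical pre-shared key


def sign_command(command: dict, key: bytes = SECRET_KEY) -> dict:
    """Attach a timestamp and HMAC-SHA256 tag so the UGV can verify origin and freshness."""
    payload = dict(command, ts=time.time())
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(key, body, hashlib.sha256).hexdigest()
    return {"body": body.decode(), "tag": tag}


def verify_command(msg: dict, key: bytes = SECRET_KEY, max_age_s: float = 5.0) -> bool:
    """Reject messages with a bad tag or a stale timestamp (a simple replay defense)."""
    expected = hmac.new(key, msg["body"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, msg["tag"]):
        return False
    return time.time() - json.loads(msg["body"])["ts"] <= max_age_s
```

Note that HMAC alone authenticates a command but does not hide its contents; a fielded system would layer this under an encrypted channel such as TLS.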

Edge Device AI

Our research implements machine learning models for object detection and classification directly on edge devices such as the NVIDIA Jetson Nano. By running AI on the edge rather than in the cloud, we enable real-time, low-latency inference on autonomous robots like Spot. This edge-device AI research at the CARDS lab aims to enable responsive robot perception and navigation without reliance on network connectivity. We develop optimized deep neural network models that are trained in simulation on synthetic datasets and deployed on a Jetson Nano integrated with Spot, allowing Spot to perceive and understand its environment using only its onboard compute.
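Whatever detector runs on the Jetson, its raw output is a set of overlapping candidate boxes that must be pruned by confidence filtering and non-maximum suppression (NMS) before the robot acts on them. The following is a generic pure-Python sketch of that post-processing step, not our deployed pipeline; the thresholds are illustrative defaults.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0


def nms(detections, conf_thresh=0.5, iou_thresh=0.45):
    """Keep the highest-confidence box in each cluster of overlapping detections.

    `detections` is a list of (box, score) pairs; returns the surviving pairs,
    highest score first.
    """
    candidates = sorted((d for d in detections if d[1] >= conf_thresh),
                        key=lambda d: d[1], reverse=True)
    kept = []
    for box, score in candidates:
        if all(iou(box, k[0]) < iou_thresh for k in kept):
            kept.append((box, score))
    return kept
```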

Internet of Things

A comprehensive full-stack website was developed to serve as a robust data management and visualization platform. The website collects and stores data such as Wi-Fi communication details, battery status, and GPS location, transmitted via RabbitMQ from a GPS module running on a Jetson Nano device. The backend uses Elasticsearch as its primary repository, providing scalable, efficient storage for the incoming data as well as fast, flexible querying for retrieval. This data is forwarded to the frontend for real-time tracking and monitoring of the Jetson Nano’s location: RabbitMQ ensures efficient, reliable transmission to the backend, and users can then view a live location feed on the frontend, giving them insight into the device’s real-time whereabouts. The same data is also visualized on a Kibana dashboard. The project showcases the power of full-stack development, combining data collection, storage, and a user-friendly frontend to deliver real-time tracking capabilities with applications ranging from asset tracking to remote monitoring.
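A sketch of the device side of this pipeline: one telemetry sample is serialized as JSON before being handed to a RabbitMQ client for publishing. The field names and device ID below are illustrative assumptions, not the schema actually used, and the publish call is shown only as a comment since it requires a client library such as pika.

```python
import json
import time


def build_telemetry(lat, lon, battery_pct, wifi_ssid, wifi_rssi_dbm):
    """Package one telemetry sample as a JSON string ready to publish to a queue."""
    return json.dumps({
        "device": "jetson-nano-01",  # hypothetical device ID
        "ts": time.time(),
        "gps": {"lat": lat, "lon": lon},
        "battery_pct": battery_pct,
        "wifi": {"ssid": wifi_ssid, "rssi_dbm": wifi_rssi_dbm},
    })

# With a RabbitMQ client such as pika, the message would then be published to a
# queue that the backend consumes and indexes into Elasticsearch, e.g.:
#   channel.basic_publish(exchange="", routing_key="telemetry", body=msg)
```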

Swarm Robotics (Multi-Robot Coordination)

Swarm robotics aims to coordinate large numbers of relatively simple robots to achieve complex goals. Our project utilizes advanced robots like Spot, Jackal, and Husky to create robotic swarms for tasks such as search and rescue operations. Leveraging the mobility of Spot, the versatility of Jackal, and the ruggedness of Husky, heterogeneous swarms will be created with complementary capabilities. By developing decentralized control algorithms and leveraging robot-to-robot communication, emergent swarm behaviors can be achieved for mapping disaster sites, locating victims, and clearing obstacles as a coordinated team. This project will push the boundaries of swarm robotics by creating capable multi-robot systems that can mimic the power of natural swarms.
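One classic building block of decentralized control is average consensus, in which each robot repeatedly nudges itself toward the mean position of the neighbors it can communicate with; no central controller is involved, yet the team converges to a common rendezvous point. The following is a minimal 2-D sketch of that idea, not the lab's actual coordination algorithm; the robot IDs and gain are illustrative.

```python
def consensus_step(positions, neighbors, gain=0.5):
    """One synchronous update: each robot moves toward the average of its neighbors.

    `positions` maps robot ID to an (x, y) tuple; `neighbors` maps robot ID to the
    IDs it can communicate with. Only local information is used.
    """
    new_positions = {}
    for rid, (x, y) in positions.items():
        nbrs = neighbors[rid]
        if not nbrs:
            new_positions[rid] = (x, y)
            continue
        avg_x = sum(positions[n][0] for n in nbrs) / len(nbrs)
        avg_y = sum(positions[n][1] for n in nbrs) / len(nbrs)
        new_positions[rid] = (x + gain * (avg_x - x), y + gain * (avg_y - y))
    return new_positions


def rendezvous(positions, neighbors, steps=50):
    """Iterate the local rule until the team (approximately) meets at one point."""
    for _ in range(steps):
        positions = consensus_step(positions, neighbors)
    return positions
```

As long as the communication graph stays connected, this local rule is guaranteed to converge, which is what makes it attractive for heterogeneous teams like Spot, Jackal, and Husky operating without reliable links to a base station.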

Smart City (Duckietown)

Duckietown is an affordable, open-source platform designed for autonomy education and research, consisting of small autonomous vehicles (“Duckiebots”) made from standard components and miniature cities (“Duckietowns”) with roads, traffic lights, and other urban features. Despite their simple construction, using only a monocular camera and a Jetson Nano for processing, Duckiebots can perform complex tasks such as lane-following, obstacle avoidance, and city navigation. This platform saves educators and researchers the effort and expense of building the underlying infrastructure, and with all materials being open-source, the community is encouraged to adopt and adapt it for their needs.

Machine Learning

ATR in Camouflaged Conditions

Automatic target recognition (ATR) under camouflaged conditions is a challenging task that involves detecting and identifying objects or targets concealed by their surroundings. Camouflage, the art of deception often used in the animal world, is also employed on the battlefield to hide military assets. Camouflaged objects blend seamlessly into their environments by adopting the colors and textures of their surroundings. This diversity in camouflage forms, including natural or artificial foliage, paint, or texture, makes it challenging for ATR systems to identify and localize targets. Camouflaged object detection (COD) finds wide-ranging applications in various fields, encompassing medical diagnosis (e.g., polyp segmentation and lung infection segmentation), industry (e.g., product inspection and anomaly detection), agriculture (e.g., locust detection to prevent invasion), security and surveillance (e.g., search-and-rescue missions, pedestrian detection, and automatic driving in bad weather), scientific research (e.g., rare species discovery), and even in the field of art (e.g., recreational art and photo blending).
In our research, we employ various Machine Learning (ML) techniques and propose novel approaches that harness multimodal visual and perceptual knowledge for camouflaged object detection in contested environments. Our work places significant emphasis on the development of well-represented depth maps in camouflaged conditions, thereby facilitating enhanced object detection capabilities. Moreover, we explore model deployment on resource-constrained edge devices such as Raspberry Pi, Jetson Nano, and Jetson Xavier to assess real-world deployable performance.
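When assessing real-world deployability on devices like the Raspberry Pi or Jetson Nano, the first measurements are usually per-frame inference latency and throughput. The sketch below is a generic wall-clock benchmark harness, not our evaluation code; the `infer` callable stands in for a deployed model's forward pass.

```python
import statistics
import time


def benchmark(infer, frames, warmup=3):
    """Measure per-frame latency (ms) and throughput (FPS) for an inference callable.

    Warmup runs are discarded because the first calls typically pay one-time
    initialization costs (model loading, memory allocation, kernel compilation).
    """
    for frame in frames[:warmup]:
        infer(frame)
    latencies = []
    for frame in frames[warmup:]:
        start = time.perf_counter()
        infer(frame)
        latencies.append((time.perf_counter() - start) * 1000.0)
    mean_ms = statistics.mean(latencies)
    return {
        "mean_ms": mean_ms,
        "worst_ms": max(latencies),
        "fps": 1000.0 / mean_ms,
    }
```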

Millimeter-Wave Radar

The main idea is to detect moving objects in open space using the AWR1642BOOST mmWave radar. First, a wavelet transform is performed on the radar signal, decomposing it into different frequency bands and allowing us to focus on the bands most likely to contain moving targets. The wavelet transform is followed by an FIR filter, which removes any remaining noise from the signal. Finally, a peak detector identifies peaks in the signal, which correspond to the locations of moving targets. This method is robust to extreme data variation, making it well suited to the challenging environment of open space. The resulting digital signal processing pipeline, comprising a wavelet denoiser, pulse-Doppler filter, and peak-detection algorithm, surpasses traditional machine-learning-based methods. In applications demanding precise target detection, from autonomous vehicles to surveillance, our method offers a significant advancement and underscores the potential of signal processing techniques to enhance radar-based sensing systems, with far-reaching implications for robotics, transportation, and security.
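The FIR-filtering and peak-detection stages of such a pipeline can be sketched in a few lines of pure Python. The wavelet-denoising and pulse-Doppler stages are omitted for brevity, and the moving-average taps and threshold here are illustrative, not the values used with the AWR1642BOOST.

```python
def fir_filter(signal, taps):
    """Causal FIR filter: y[n] = sum_k taps[k] * x[n - k]."""
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, h in enumerate(taps):
            if n - k >= 0:
                acc += h * signal[n - k]
        out.append(acc)
    return out


def find_peaks(signal, threshold):
    """Indices of local maxima above `threshold` (candidate target locations)."""
    return [n for n in range(1, len(signal) - 1)
            if signal[n] > threshold
            and signal[n] >= signal[n - 1]
            and signal[n] > signal[n + 1]]


# A 5-tap moving average stands in for a properly designed low-pass FIR filter.
taps = [1.0 / 5] * 5
```

A causal FIR filter of length N delays the signal by roughly (N - 1) / 2 samples, so detected peak indices are shifted slightly relative to the raw return; range estimates must account for that group delay.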

Spot Robot Navigation Modalities: Auto Nav, Voice Control, Gesture Control