Cameras are Bad… on Their Own

Autonomous Systems

Photo by Tech Nick on Unsplash
  1. Sensing — This step focuses on collecting data from every available sensor, and a lot of it.
  2. Perceiving — This step focuses on understanding that data: detecting objects and building a picture of the surroundings.
  3. Planning — This step focuses on using that picture to decide what to do and find a path.
  4. Acting — This step focuses on controlling the device so it follows the plan.
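The four steps above form a loop that runs continuously. A minimal sketch of that loop in Python, where every function name and number is an illustrative placeholder rather than a real autonomous-driving API:

```python
# A toy sense -> perceive -> plan -> act loop. All names and values
# are made up for illustration; a real stack is far more complex.

def sense():
    # Collect raw readings from every sensor (camera, lidar, radar, ...).
    return {"camera": [0.9], "lidar": [10.2], "radar": [10.5]}

def perceive(raw):
    # Turn raw data into an understanding of the scene,
    # e.g. "there is an obstacle roughly 10 m ahead".
    return {"obstacle_distance_m": sum(raw["lidar"] + raw["radar"]) / 2}

def plan(world):
    # Decide what to do: brake if the obstacle is close.
    return "brake" if world["obstacle_distance_m"] < 15 else "cruise"

def act(command):
    # Send the chosen command to the vehicle's controls.
    return f"executing: {command}"

print(act(plan(perceive(sense()))))
```

Each stage only depends on the output of the one before it, which is why the pipeline is usually described as four separate steps.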

Quality of Data

Improving the quality of your data is a big part of sensor fusion. By combining measurements from several sensors, and comparing different fusion algorithms to see which works best, you get a more accurate picture than any single sensor could provide. The more independent sources you collect data from, the better your estimate becomes.

Increase Reliability

Having more sensors also increases reliability: if one sensor produces a faulty reading, the others make that outlier much easier to recognize and discard.
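A simple way to sketch this, with hypothetical readings: compare each sensor's measurement against the group median and drop anything that disagrees too strongly.

```python
def reject_outliers(readings, tolerance=2.0):
    # Flag readings that disagree strongly with the group median.
    ordered = sorted(readings)
    median = ordered[len(ordered) // 2]
    return [r for r in readings if abs(r - median) <= tolerance]

# Four sensors agree on roughly 10 m; one (perhaps a blinded
# camera) reports 42 m. Made-up numbers for illustration.
readings = [10.1, 9.8, 10.3, 42.0, 10.0]
print(reject_outliers(readings))  # the 42.0 reading is dropped
```

With only one or two sensors there is no "group" to compare against, which is exactly why redundancy buys reliability.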

Measuring the Immeasurable States

This is what we talked about right at the beginning of the article. With different types of sensors, you can use one sensor's strengths to compensate for another's weaknesses. If you look at the chart above, you can see the different strengths and weaknesses of cameras, lidar and radar.
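One standard way to combine two such sensors is inverse-variance weighting: trust the less-noisy sensor more. The numbers below are assumptions for illustration (radar treated as the better range sensor, the camera as noisier at range):

```python
def fuse(m1, var1, m2, var2):
    # Inverse-variance weighting: weight each measurement by
    # how little noise (variance) that sensor has.
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * m1 + w2 * m2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Radar range estimate: 10.0 m, low variance (0.1).
# Camera range estimate: 11.0 m, higher variance (0.9).
estimate, variance = fuse(m1=10.0, var1=0.1, m2=11.0, var2=0.9)
print(estimate, variance)  # sits close to the radar's estimate
```

Note that the fused variance is smaller than either sensor's alone: the combination is "measuring" better than its parts.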

Increase Coverage

Different sensors also cover different fields of view and ranges, so combining them lets the system observe more of its surroundings at once than any single sensor could.

Example of the Central Limit Theorem

Some of the most common sensor fusion algorithms include:
  • Kalman filter
  • Bayesian networks
  • Dempster-Shafer
  • Convolutional neural network
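Of these, the Kalman filter is the workhorse. A minimal one-dimensional sketch, with made-up noise values, tracking a roughly constant distance from a stream of noisy readings:

```python
def kalman_1d(measurements, meas_var, x0=0.0, p0=1.0, process_var=1e-4):
    # A minimal 1-D Kalman filter for a roughly constant quantity.
    # x is the estimate, p is its uncertainty (variance).
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += process_var              # predict: uncertainty grows a little
        k = p / (p + meas_var)        # Kalman gain: how much to trust z
        x += k * (z - x)              # update the estimate toward z
        p *= (1 - k)                  # uncertainty shrinks after the update
        estimates.append(x)
    return estimates

# Hypothetical noisy range readings around a true distance of ~10 m.
zs = [10.3, 9.7, 10.1, 9.9, 10.2, 10.0]
print(kalman_1d(zs, meas_var=0.25)[-1])
```

Each new measurement nudges the estimate, with the gain automatically balancing the filter's confidence against the sensor's noise; the full vehicle-tracking version works the same way in more dimensions.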

The Future

Photo by Robynne Hu on Unsplash

Varsha Prasad

A student interested in emerging tech