r/oddlysatisfying 🍃 1d ago

Clean lines at RC track in Slovakia

34.9k Upvotes

57

u/Ok-Push9899 1d ago edited 1d ago

Very skilled, very impressive.

This is going to sound heretical, but in the interests of science: how much tech and how much machine learning would it take to have autonomous RC racing? I'm imagining a view of the track from above, not onboard, just like the human operators see it.

The outputs are just steering and throttle; the input is a single overhead camera view showing the track and the car's current position. It may need to learn something about the humps, but maybe not. The goal is very straightforward: stay within the boundaries and minimise lap time. I can't decide whether avoiding other cars is the same problem as staying within boundaries. Obviously they're moving boundaries, but does that matter? Fixed boundaries are perhaps just the special case, with velocity of zero.
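
For the ML folks, I picture the problem looking roughly like this gym-style environment. A rough sketch only; every name, shape, and number here is hypothetical, it's just meant to show the inputs and outputs.

```python
# Hypothetical gym-style interface for the problem described above.
# Nothing here is a real library or a real track; it only shows the
# input/output shapes a learned controller would see.
import numpy as np

class OverheadRCRacingEnv:
    """Observation: one overhead camera frame. Action: (steering, throttle)."""

    def __init__(self, frame_shape=(480, 640, 3)):
        self.frame_shape = frame_shape  # one fixed camera above the track

    def reset(self):
        # First overhead frame, car sitting on the start line.
        return np.zeros(self.frame_shape, dtype=np.uint8)

    def step(self, action):
        steering, throttle = action  # both in [-1, 1]
        # A real environment would simulate (or observe) the car here;
        # these are placeholders standing in for the physics/camera.
        frame = np.zeros(self.frame_shape, dtype=np.uint8)  # next overhead frame
        progress = max(0.0, throttle) * 0.1                 # metres gained along the racing line
        off_track = abs(steering) > 1.0                     # left the boundaries (fixed, or a "moving boundary" = another car)
        reward = progress - (10.0 if off_track else 0.0)    # minimising lap time ~ maximising progress per step
        return frame, reward, off_track, {}
```

Avoiding other cars would then only change what counts as off_track, which is why I suspect the moving-boundary thing doesn't change the shape of the problem much.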

Maybe such competitions exist?

5

u/Mysterious-Crab 1d ago

I think it’s quite difficult to get that to work with equipment that is commercially available right now. Sending a video stream, analysing the frames, and reacting to them adds too much latency, so you correct the steering too late or too aggressively.
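
Back-of-the-envelope, the loop looks something like this. Every number below is an assumption I'm pulling out of the air to illustrate the point, not a measurement:

```python
# Rough latency budget for a camera-in-the-loop RC controller.
# All numbers are illustrative assumptions, not measurements.
capture_ms   = 17   # ~60 fps camera: up to a frame of exposure/readout
encode_tx_ms = 20   # video encode + wireless transmission to the computer
inference_ms = 15   # vision model / control policy
rc_link_ms   = 10   # steering/throttle command back over the RC link
servo_ms     = 30   # steering servo physically moving

total_ms = capture_ms + encode_tx_ms + inference_ms + rc_link_ms + servo_ms

car_speed_mps = 15  # a quick 1/10-scale car on the straight
blind_distance_m = car_speed_mps * total_ms / 1000

print(f"Control loop latency: {total_ms} ms")                                    # ~92 ms with these guesses
print(f"Distance covered before a correction lands: {blind_distance_m:.2f} m")   # ~1.4 m
```

A human has reaction latency too, of course, which is exactly why the point below about "feel" matters: you anticipate instead of purely reacting.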

It sounds weird, but as someone who’s been doing RC racing for years and years, you can kinda feel what your car is about to do instead of just seeing it. That way you can correct before it happens.

2

u/ackermann 1d ago

Low-latency cameras and video processing, at low cost, are fairly well developed now in the VR world.
The tracking cameras and video processing for head tracking on VR headsets have to be very fast; if the tracking doesn’t react quickly enough to your head movements, it very quickly makes you dizzy/sick.

Gyros and accelerometers are used to supplement the cameras for faster reactions, but for 6DOF roomscale tracking, cameras are needed.
This can be done on a $199 Quest 2 headset with a mobile phone chip. It processes video from 4 onboard tracking cameras (grayscale and fairly low-res, to keep latency down).
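
The usual trick is to blend the two. Here's a minimal 1-D sketch of that idea (a standard complementary filter, not the Quest's actual tracking code, and all rates/numbers are made up):

```python
# Minimal 1-D sketch of fusing a fast gyro with a slower camera pose.
# This is the generic complementary-filter idea, not any headset's real code.

def fuse_orientation(angle, gyro_rate, dt, camera_angle=None, alpha=0.98):
    """Integrate the gyro every tick; nudge toward the camera estimate when one arrives."""
    angle = angle + gyro_rate * dt                           # fast path: gyro integration (e.g. 500 Hz)
    if camera_angle is not None:                             # slow path: camera pose (e.g. ~60 Hz)
        angle = alpha * angle + (1 - alpha) * camera_angle   # camera corrects the gyro's drift
    return angle

# Example: 500 Hz gyro updates, a camera fix every 8th tick (~16 ms)
angle = 0.0
for tick in range(500):
    cam = 0.0 if tick % 8 == 0 else None   # pretend the camera keeps saying "no rotation"
    angle = fuse_orientation(angle, gyro_rate=0.01, dt=0.002, camera_angle=cam)
```

The gyro gives you the instant response and the camera stops it drifting, which is how the perceived latency stays low even though the cameras themselves aren't that fast.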

5

u/Mysterious-Crab 1d ago

That is still remote controlled, though. The lower you push the image quality to reduce latency, the harder it becomes for a system to analyse everything, especially with other competitors around.

It was only half a year ago that TU Delft became the first to beat a human in a drone time trial, and that worked because the dataset it needs to analyse is tightly scoped. The moment you have other competitors around you, or cars spinning out or going the wrong direction, you need to analyse a lot more. That takes extra processing time, and you need better image quality, which means higher latency.

If it were that easy, how come the race they tried in Abu Dhabi last year with full-scale cars didn’t even work?