Bilateral teleoperation over network (source code and video)

It’s been more than a year since my last post as I’ve been quite busy, to be honest. Thankfully, I passed my PhD viva and I’m currently working as a researcher at King’s College London. I’m still putting together a research plan, but among the various tasks I do in parallel I managed to write some C++ code for bilateral teleoperation, i.e. teleoperation with force feedback, using two Geomagic Touch devices by 3D Systems over a network. You can find the code on GitHub.

The bilateral teleoperation system uses a position-position configuration, meaning that only the angular values of the joints are exchanged between the devices. The upside is that there is no need for the force sensors and external tooling that a position-force configuration requires to measure forces from the remote environment. This keeps the system nice and simple, yet also more functional, as all the moving parts of the robotic arm respond to any change imposed by the remote environment. The downside is that its transparency (the quality of telepresence) during contact is just not as good as the alternative’s.
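
To give an idea of why no force sensor is needed, here is a minimal, self-contained sketch of the position-position coupling (illustrative only; the names and gains are hypothetical, not taken from the repository). Each side computes a torque proportional to the difference between the remote joint angles and its own, so both devices are pulled towards each other’s configuration, and the reflected force is simply the position error scaled by a gain:

```cpp
#include <array>
#include <cstddef>
#include <cstdio>

using Joints = std::array<double, 3>; // the Touch's three actuated joints

// tau_i = Kp * (q_remote_i - q_local_i): no force sensor needed, the
// feedback force emerges from the position error on each side.
Joints couplingTorque(const Joints& local, const Joints& remote, double kp) {
    Joints tau{};
    for (std::size_t i = 0; i < tau.size(); ++i)
        tau[i] = kp * (remote[i] - local[i]);
    return tau;
}

int main() {
    Joints master{0.10, -0.25, 0.40}; // local device pose (radians)
    Joints slave {0.00,  0.00, 0.00}; // remote device pose (radians)
    Joints tau = couplingTorque(master, slave, 0.5); // hypothetical gain
    for (double t : tau) std::printf("%+.3f ", t);
    std::printf("\n");
    return 0;
}
```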

Once the touchp2p.cpp file is compiled on each PC where a Touch device is connected, it runs a PID controller that receives the reference angular values of the other device’s joints. All that’s required is to set the correct IPs of the local and remote machines on each side of the teleoperation system.
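
For the curious, here is a hedged sketch of the per-joint PID update such a controller could run (the gains, names, and structure are illustrative, not copied from touchp2p.cpp). The reference angle would be the value received over the network from the remote device:

```cpp
#include <cstdio>

struct Pid {
    double kp, ki, kd;   // controller gains
    double integral = 0; // accumulated error
    double prevErr  = 0; // error at the previous step

    // One control step: returns the torque command for this joint.
    double update(double qRef, double q, double dt) {
        double err = qRef - q;           // qRef arrives from the remote side
        integral += err * dt;
        double deriv = (err - prevErr) / dt;
        prevErr = err;
        return kp * err + ki * integral + kd * deriv;
    }
};

int main() {
    Pid pid{2.0, 0.1, 0.05};     // hypothetical gains
    double q = 0.0, dt = 0.001;  // 1 kHz haptic loop
    // Pretend the remote side keeps sending a constant reference of 0.5 rad.
    for (int i = 0; i < 5; ++i) {
        double tau = pid.update(0.5, q, dt);
        q += 0.001 * tau;        // toy plant, just to show the loop shape
        std::printf("step %d: q = %.4f, tau = %.4f\n", i, q, tau);
    }
    return 0;
}
```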

A simple demo on the impact of latency in teleoperation

Hello everyone!

It’s been so long since I wrote a blog post. You can’t imagine how much I’ve been waiting for a reason to start writing and…here it is!

Since the last time I wrote something, I finished my MSc degree at NKUA and started my PhD at King’s College London. I am now studying and working on haptics over the upcoming 5G network infrastructure. How it works, how to improve it, and how to make it more usable for a number of use cases are a few of the questions I’m looking to answer.

The demo

I’m happy to say that I have just uploaded the first demo I’ve made since I joined the KCL-Ericsson 5G lab. It’s rather simple but gets the job done. It also never fails to impress people who don’t know what it’s like to teleoperate something under latency.

So, here it is: https://github.com/constanton/DelayedTeleoperationDemo

It’s actually a modified version of one of the examples provided by the Chai3D C++ framework that I’ve been using lately. The modification was simply adding one buffer to the position channel and another to the feedback channel. As you increase the latency (i.e. the size of the buffers) above 10 ms, the result is a de-synchronization between the data you send and the data you receive, which makes the haptic device unstable…it really starts to “kick” when you touch anything with the grey ball.
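
A rough sketch of that kind of FIFO delay buffer is below (illustrative; not the exact code from the repository). A sample is pushed in each haptic frame and popped out N frames later, so with a 1 kHz loop a buffer of N = 10 adds roughly 10 ms of latency to the channel:

```cpp
#include <cstddef>
#include <cstdio>
#include <deque>

template <typename T>
class DelayLine {
public:
    explicit DelayLine(std::size_t delayFrames) : delay_(delayFrames) {}

    // Push the newest sample; return the sample from delay_ frames ago.
    T push(const T& sample) {
        buf_.push_back(sample);
        if (buf_.size() <= delay_) return T{}; // not yet filled: output zero
        T out = buf_.front();
        buf_.pop_front();
        return out;
    }

    void setDelay(std::size_t frames) { delay_ = frames; } // '+'/'-' keys

private:
    std::size_t delay_;
    std::deque<T> buf_;
};

int main() {
    DelayLine<double> positionChannel(3); // 3-frame delay for this toy run
    for (int i = 1; i <= 6; ++i)
        std::printf("in %d -> out %.0f\n", i, positionChannel.push(i));
    return 0;
}
```

One instance of this would sit on the position channel and another on the feedback channel; since the two buffers are independent, growing them pushes the sent and received data further out of sync, which is exactly what destabilizes the device.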

The haptic device used is the Sensable Phantom Omni (IEEE 1394 version), which only works under Windows (at least for me). So, in case anyone has made it work under Linux, please send over a how-to 🙂

There is room for improvement, further modification, and optimization. One idea is to implement at least one stability-control algorithm and compare it against running without one.

Anyway, here is a pic from the application.

Application screenshot
You can slide the cylinder along the string using the grey ball, which you control with the haptic device. You can change the latency (shown at the bottom right) with the + and - keys.