
Rescuing Doctor Who’s K-9

Computer Science Student Brings the Doctor’s Companion Back from Oblivion

Optimism: Belief that everything will work out well. Irrational, bordering on insane. – K-9, “The Armageddon Factor, Episode One,” Doctor Who, 1979

When Abertay University student and computer science major Gary Taylor found the fried, water-damaged remains of K-9, Doctor Who’s reliable robotic canine companion, in a university lab last September, he made the remote-controlled robot celebrity the focus of his final-year dissertation even before getting the go-ahead from his project supervisor, Dr. Ian Ferguson. As Taylor explains, “I love robotics, I love programming, I love dogs, and I love Doctor Who.”

The university purchased this K-9, one of roughly a dozen prop K-9s sold off by the BBC, during the 2011-2012 academic year, but the robot was eventually abandoned. When Taylor found the lost little robot dog in a lab, he discovered that a roof leak had destroyed all of the interior electronic and electromechanical components, leaving only K-9’s easily recognized shell.

The original K-9 was introduced into the long-running “Doctor Who” television series in 1977. Because it was based on 1970s-era analog radio-control gear, it was a fairly unreliable prop, as were most RC props of the era, such as the R2-D2 prop used in “Star Wars.” The studio’s broadcast video electronics interfered with the RC link, causing the robot to crash into sets and forcing frequent scene retakes.

Taylor’s plan was not only to restore the K-9 robot but to improve it with 21st-century electronics. The dissertation’s title essentially explains the entire project: “Creating an Autonomous Robot Utilizing Raspberry Pi, Arduino, and Ultrasound Sensors for Mapping a Room.” This was no publicity stunt to reanimate a favorite TV mascot. It was a serious senior engineering project with the goal of creating an autonomous robot based on some very advanced electronics and software that merely lives inside a favorite TV mascot’s shell.

The project started in October of last year, and Taylor unveiled the restored and vastly improved K-9 last week at Abertay’s Digital Graduate Show.

From the start, Taylor decided to build a distributed embedded system based on a Raspberry Pi 3 Model B and an Arduino Mega2560. He recognized that the project stood a better chance of success by offloading the real-time tasks—motor and sensor control—to the Arduino’s on-board microcontroller, which would allow the quad-core, 1.2GHz, 64-bit Arm Cortex-A53 microprocessor inside the board’s Broadcom BCM2837 application processor to handle the high-level mapping functions. The plan also included using the Raspberry Pi 3 Model B’s built-in Bluetooth wireless capabilities to link to a OnePlus 3T mobile phone. A custom-built app running on the phone serves as a handheld remote control for the robot.

Here’s the resulting block diagram of the design project, taken from Taylor’s dissertation:

[Figure: system block diagram from Taylor’s dissertation]

The Arduino Mega2560 directly operates the robot’s three HC-SR04 ultrasound sensors using GPIO pins and an existing ultrasound library. One sensor faces directly forward and the other two sensors are angled to the left and the right. The three white sensors appear along the bottom edge of the robot in this photo:
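The HC-SR04 works by emitting an ultrasonic ping and reporting the echo’s round-trip time as a pulse width. The conversion from pulse width to distance is simple enough to sketch; this helper is purely illustrative (the actual measurement is done in Taylor’s Arduino code, not on the Pi):

```python
# Hypothetical helper illustrating the HC-SR04 math: the sensor reports
# distance as the duration of its echo pulse (the ultrasonic round trip).
SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def echo_to_distance_cm(echo_duration_us: float) -> float:
    """Convert an HC-SR04 echo pulse width (microseconds) to distance (cm).

    The pulse covers the trip to the obstacle and back, so halve it.
    """
    return (echo_duration_us * SPEED_OF_SOUND_CM_PER_US) / 2.0

# A 580 us echo corresponds to roughly 10 cm.
print(round(echo_to_distance_cm(580), 1))  # 9.9
```

Because the speed of sound varies with temperature, serious implementations sometimes compensate; the fixed constant here is the usual room-temperature approximation.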

The Arduino Mega2560 controls the robot’s left and right traction motors through a Monster Moto Shield plugged into the Arduino’s shield headers. These motors drive a pair of wheels while a third, undriven pivot wheel towards the back of the robot provides stability. In aircraft jargon, this K-9 is a tail-dragger.

The motor-control shield uses a pair of STMicroelectronics VNH2SP30 full-bridge, high-current motor drivers capable of handling motor loads as large as 30A at 41V, although the robot runs its motors at much lower voltages and currents. As the block diagram above shows, the Arduino drives the VNH2SP30 motor drivers on the shield directly through its PWM outputs, using an existing Arduino software library. If you shop online, you can find clones of this motor-control shield for less than $6. At that price, you can hardly afford to roll your own, especially for a one-off project like this one.
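Each VNH2SP30 full bridge is steered by two direction inputs (INA, INB) plus a PWM duty cycle that sets speed. The mapping from a signed speed command to those three signals can be sketched as below; the function and its value ranges are an illustration, not code from Taylor’s project:

```python
def vnh2sp30_command(speed: float) -> tuple:
    """Map a signed speed in [-1.0, 1.0] to hypothetical VNH2SP30 inputs.

    The driver takes two direction pins (INA, INB) and a PWM duty cycle:
    INA high / INB low drives forward, the reverse drives backward, and
    both low lets the motor coast.  PWM duty sets the speed.
    """
    speed = max(-1.0, min(1.0, speed))
    duty = int(abs(speed) * 255)       # 8-bit PWM range, as on the Arduino
    if speed > 0:
        return (True, False, duty)     # forward
    if speed < 0:
        return (False, True, duty)     # reverse
    return (False, False, 0)           # coast

print(vnh2sp30_command(0.5))  # (True, False, 127)
```

Driving both direction pins the same way brakes the motor on the real chip; the sketch above only uses the coast state for simplicity.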

In the original project plan, Taylor also intended to use the Arduino to control a TDK InvenSense MPU-6050 6-DOF (degrees of freedom) MEMS IMU (inertial measurement unit). The MPU-6050 combines a 3-axis gyroscope and a 3-axis accelerometer on the same silicon die and is popular for use in dead-reckoning guidance systems. Basically, it’s a motion-tracking sensor. The MPU-6050 communicates over I2C and is a favorite device among robotics experimenters, so there’s naturally an Arduino library for it, as well as inexpensive breakout boards that make the small SMT device easy to work with.
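Dead reckoning means integrating rate measurements into a pose: the gyro’s yaw rate accumulates into a heading, and distance traveled along that heading accumulates into a position. A toy 2D version of the idea, with invented sample tuples, looks like this:

```python
import math

def dead_reckon(steps):
    """Integrate (yaw_rate_deg_s, speed_cm_s, dt_s) samples into a 2D pose.

    A toy version of the dead reckoning an IMU like the MPU-6050 enables:
    the gyro supplies the yaw rate; wheel speed supplies the distance.
    """
    x = y = 0.0
    heading_deg = 0.0
    for yaw_rate, speed, dt in steps:
        heading_deg += yaw_rate * dt               # integrate gyro rate
        d = speed * dt                             # distance this step
        x += d * math.cos(math.radians(heading_deg))
        y += d * math.sin(math.radians(heading_deg))
    return x, y, heading_deg

# Drive straight 10 cm, turn 90 degrees in place, drive another 10 cm.
pose = dead_reckon([(0, 10, 1.0), (90, 0, 1.0), (0, 10, 1.0)])
print([round(v, 1) for v in pose])  # [10.0, 10.0, 90.0]
```

The catch, as Taylor would discover, is that this scheme is only as good as the sensor data feeding it: errors accumulate, and missing or corrupted IMU samples leave the robot lost.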

The Raspberry Pi 3 Model B sends control messages to, and receives sensor information from, the Arduino Mega2560 over a USB connection. Taylor’s dissertation discusses serial alternatives such as an asynchronous UART or I2C, but both the Raspberry Pi and the Arduino have robust, community-supported libraries for USB communications, so that’s what he ultimately chose.
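On the Pi side, a USB-serial link like this is typically read line by line (for instance with the pyserial library), with each line parsed into sensor values. The dissertation doesn’t publish the actual message format, so the line protocol below is invented purely to show the shape of such a parser:

```python
def parse_sensor_line(line: str) -> dict:
    """Parse one hypothetical sensor report from the Arduino.

    Assumes a made-up line protocol of the form "S,<left>,<front>,<right>"
    with the three ultrasound distances in centimetres.
    """
    fields = line.strip().split(",")
    if len(fields) != 4 or fields[0] != "S":
        raise ValueError(f"unexpected message: {line!r}")
    left, front, right = (int(f) for f in fields[1:])
    return {"left": left, "front": front, "right": right}

print(parse_sensor_line("S,42,120,37"))
```

Validating each message and raising on anything malformed matters here: a noisy serial link can deliver partial lines, and silently accepting them would poison the map.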

Using the sensor data obtained from the Arduino Mega2560, the Raspberry Pi 3 Model B creates a location and obstacle map as the robot explores its surroundings. The map is stored in an on-board SQLite 3 database; SQLite is public-domain code already compiled to run on the Raspberry Pi.
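Python’s standard library ships with SQLite bindings, so storing a map this way costs very little code. A minimal sketch of the idea, one row per observed grid cell, with table and column names invented for illustration:

```python
import sqlite3

# Minimal sketch of an obstacle map in SQLite: one row per grid cell the
# robot has observed.  The schema is invented, not Taylor's actual one.
conn = sqlite3.connect(":memory:")  # the robot would use an on-disk file
conn.execute(
    "CREATE TABLE map_cells (x INTEGER, y INTEGER, occupied INTEGER, "
    "PRIMARY KEY (x, y))"
)

def record_cell(x: int, y: int, occupied: bool) -> None:
    """Insert or update one observed grid cell."""
    conn.execute(
        "INSERT OR REPLACE INTO map_cells VALUES (?, ?, ?)",
        (x, y, int(occupied)),
    )

record_cell(3, 4, True)
record_cell(3, 5, False)
obstacles = conn.execute(
    "SELECT x, y FROM map_cells WHERE occupied = 1"
).fetchall()
print(obstacles)  # [(3, 4)]
```

The `INSERT OR REPLACE` lets a later, better observation of the same cell overwrite an earlier one, which is handy when sensor readings disagree.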

As mentioned, the Raspberry Pi also supplies the Bluetooth communications to the OnePlus 3T cellular phone. For this, Taylor used the PyBluez Python-based wrapper for the official Bluez Linux Bluetooth stack. The Raspberry Pi 3 Model B is running the Raspbian flavor of the Debian operating system, which is based on the Linux kernel.
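Over such a Bluetooth link, the phone app boils down to sending short commands that the Pi translates into motor actions. PyBluez would deliver those commands as bytes over an RFCOMM socket; since the app’s actual protocol isn’t documented in the dissertation, the command set below is invented to show the dispatch pattern:

```python
def handle_command(cmd: bytes) -> tuple:
    """Map a hypothetical phone command to (left, right) motor speeds.

    The command names are invented for illustration; a real PyBluez
    server would receive these bytes from sock.recv() on an RFCOMM
    socket connected to the phone.
    """
    table = {
        b"FWD": (1.0, 1.0),
        b"REV": (-1.0, -1.0),
        b"LEFT": (-0.5, 0.5),    # spin left in place
        b"RIGHT": (0.5, -0.5),   # spin right in place
        b"STOP": (0.0, 0.0),
    }
    # Fail safe: stop on anything unrecognised or garbled.
    return table.get(cmd.strip().upper(), (0.0, 0.0))

print(handle_command(b"fwd\n"))  # (1.0, 1.0)
```

Defaulting unknown input to a stop command is a small but important safety habit for anything with motors.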

Taylor’s dissertation charts a methodical approach to the project and includes a Gantt chart:

[Figure: project Gantt chart from Taylor’s dissertation]

The first planned steps in the project included prototyping and coding the Arduino-based motor control functions. After all, you need a moving robot to continue with the next development steps. Once the basic sensor and motor platform could move around and avoid obstacles, Taylor planned to add mapping capabilities and autonomy via the Raspberry Pi.
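The obstacle-avoidance step that this staged plan builds toward can be reduced to a simple policy over the three ultrasound readings: go forward while the path ahead is clear, otherwise turn toward whichever side has more room. This sketch is a generic version of that idea, not Taylor’s code, and the 30 cm threshold is an assumption:

```python
def choose_action(left_cm: float, front_cm: float, right_cm: float,
                  threshold_cm: float = 30.0) -> str:
    """Pick a motion from the three ultrasound distance readings.

    A toy avoid-obstacles policy: forward when the path is clear,
    otherwise turn toward whichever side reports more free space.
    """
    if front_cm > threshold_cm:
        return "forward"
    return "turn_left" if left_cm >= right_cm else "turn_right"

print(choose_action(50, 20, 80))  # turn_right
```

Real implementations layer hysteresis and speed scaling on top of this so the robot doesn’t oscillate between decisions at the threshold boundary.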

This is a very ambitious engineering project, especially for a fourth-year undergraduate student, so it’s no surprise that the project didn’t progress exactly as planned on the Gantt chart. After all, that rarely happens in real life.

After full integration, Taylor made a few test mapping runs and got inconsistent results. In the first test, which lasted three minutes, the robot detected objects that didn’t exist. In the second test, which lasted five minutes, the resulting map appeared to be too small. Something was clearly amiss.

The third test, another 3-minute test, appeared to produce better results, so Taylor pressed on by adding an obstacle to the environment before the fourth test. At this point, it became clear that the robot simply did not know where it was. There was a significant problem with the inertial-measurement portion of the design.

The fault lay in the interface to the MPU-6050 IMU. Reads from the MPU-6050 produced FIFO overflow errors, so the data from the IMU could not be used. If the robot didn’t know where it was and what route it had taken to get there, it could not draw an accurate map.
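This failure mode is well known among MPU-6050 users: the chip buffers sensor packets in a 1024-byte on-chip FIFO, and if the host can’t drain packets as fast as the chip produces them, the FIFO overflows and subsequent reads return corrupted, misaligned data. The usual mitigation is to check the FIFO count before each read and reset the FIFO (discarding the stale backlog) when it nears capacity. A pure-Python sketch of that check, with the 12-byte packet size an assumption about the configured output:

```python
MPU6050_FIFO_SIZE = 1024  # the MPU-6050's on-chip FIFO is 1024 bytes

def fifo_overflowed(fifo_count: int, packet_size: int = 12) -> bool:
    """Report whether the FIFO should be reset before the next read.

    If fewer than one packet's worth of space remains, the next sample
    may be truncated, so it is safer to reset and start fresh than to
    read data that is no longer packet-aligned.
    """
    return fifo_count >= MPU6050_FIFO_SIZE - packet_size

print(fifo_overflowed(1020), fifo_overflowed(500))  # True False
```

On the real device the host would then write the FIFO-reset bit in the chip’s USER_CTRL register and resume reading whole packets, which is the kind of fix the project ran out of time to land.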

That’s when time ran out. Taylor could control his robot using an app running on the cellular phone but the robot could not navigate autonomously.

In his dissertation, Taylor writes:

“The scale of the project was much larger than expected from the proposal stage of the project. Whenever a project scope gets added up or modified or enhanced, it leads to scope creep in the project. The scope creep has an impact on project quality and success factors such as cost, time, function points, number of developers etc.”

Later, in a section labeled “Future Work,” he writes:

“There is much room for improvement in this project. The final model was only a prototype to display the functionality of the device; changes would be required for a better functioning device.

“Firstly, the robot software design needs to be changed to include the Gyroscope. The Gyroscope can make the robot motion more accurate and can be used to set the robot direction easier and solve the problem of the offset of the device. This would provide much more accurate results in terms of accurate data.”

Although what Taylor ultimately built was more a remote-controlled robot than an autonomous one, the project was a huge success. He gained many of the project-development skills he will need going forward, including careful project planning, good ongoing documentation, and the importance of avoiding deadly feature creep.

Those are the obvious engineering skills he’s learned from his project.

Taylor is also beginning to learn about another important skill—a soft one. Autonomous robot projects in universities are common. They’re cool, but they’re common, and none are likely to catch the public’s attention. However, by encasing his autonomous robot in a celebrity skin, K-9’s skin, he elevated his project’s recognition factor by many orders of magnitude.

Taylor and his revived K-9 have now appeared in UK newspapers including The Scottish Sun, The Courier, and The Daily Record as well as the SyFy cable channel’s online news site SyFyWire and the Doctor Who fan site The Gallifreyan Newsroom. Taylor’s project has now exposed him to some of the fundamentals of marketing that every high-tech entrepreneur should master.
