Humanity will eventually combine sensing, computation, and actuation (ie. robots) with human biology to create more-capable human "cyborgs" (for lack of a better word). There are a number of good examples of this already: contact lenses that change their focus, exoskeletons that provide superhuman strength, and even some specialized prosthetics. By and large, these advances are non-invasive. However, I think we're nearing an era where that will no longer be a restriction -- people will have elective surgeries to add robotic elements to their bodies. My friend Ravi Balasubramanian from Oregon State (recent NSF CAREER Award winner!) is already making great strides in this vein; Ravi designs implantable robotic mechanisms that re-engineer the human hand to restore adaptive grasping capabilities to patients who have lost range-of-motion or grasp strength -- read on for details. This is just the beginning; some day, I imagine technologies that let us programmatically change muscle-tendon connections to trade off dexterity vs. strength (much like humans vs. chimps). Warning: This post contains photos of surgical implants in human cadavers; some people may find such images disturbing.
Robotics has gotten a bit spoiled in sensing over the last 8 years; we now have low-cost Kinect-style depth cameras for indoor robots and hyper-expensive laser rangefinders (LRFs) for autonomous cars that sport 30-60+ independent laser beams. But we're still lacking a commercially-available, low-cost, long-range LRF (sub-$500, 15m+ range) -- a gap that is absolutely crucial for robots doing navigation and mapping in larger indoor spaces (eg. warehouses & retail) -- at least until visual SLAM becomes a bit more viable, with off-the-shelf solutions that can handle uniform and poorly-textured environments. Until recently, the only option (in hobbyist-style quantities) was to buy a Neato off eBay and hack off its LRF. Now, a startup called Scanse is changing that... They're developing a long-range (40m) scanning LRF for around $300, and their Kickstarter needs some love -- it's literally the first (and only) project I've ever backed, and I hope you'll join me.
2015 was an insane year for robotics companies; they raised $922.7M in VC funding -- 170% more than in 2014. I'm almost certain the true total exceeds $1 Billion, especially if you account for funding events in Asia (opaque to me) or companies at the periphery of robotics (sensing, software, 3D printing, etc). As in previous years, a large portion of the funding went to medical and drone companies, but we also saw a lot of late-stage consumer robot financings this year (such as Jibo and Sphero) -- and comparatively few agricultural or service robots. Still, I think it's safe to say: 2015 was the year of the robotics startup!
Every year MIT Technology Review announces its "35 Innovators Under 35" Awards (TR35 Awards). In recent years the list has included a few roboticists, and Hizook has covered several of the announcements (here, here, and here). This year is especially meaningful for two reasons: (1) it includes three roboticists and several other individuals in machine learning and sensing; and (2) I'm one of the winners! The three TR35 roboticists from this year were Conor Walsh (soft, wearable robots at Harvard), Melonee Wise (Willow Garage, and now warehouse robots at Fetch), and Travis Deyle (healthcare robotics plus the Internet of Things, now doing healthcare sensors at Google[x] Life Sciences).
I share a commonly-held vision for the Internet of Things wherein mobile devices (such as robots) interact with smart objects -- objects with embedded microelectronics that perform computation, sensing, communication, energy harvesting, and energy storage. The other day, my colleagues and I published an open-access white paper that illustrates one incarnation of this vision: Robots interacting with long-range, "sensorized" RFID tags to provide IoT-style sensing -- i.e. an example of how autonomous mobile robots outfitted with UHF RFID readers could interact with sensor tags to perform tasks such as soil moisture sensing, remote crop monitoring, infrastructure monitoring, water quality monitoring, livestock monitoring, and remote sensor deployment. You can find the white paper on arXiv.org (or direct PDF). Check the paper for exact details, but I'd like to share some of the concepts here on Hizook... with plenty of pretty pictures!
While walking around the Bay Area Maker Faire this weekend, I stumbled across an amazing piece of technology: Valve's "Lighthouse" tracking system. Valve's demos were (supposedly) a major contributor to Oculus' fundraising efforts and ultimate sale to Facebook, and this device may have been a key piece of those demos. I'd heard rumors about this system for many months. It was designed for augmented and virtual reality -- namely, for head and controller tracking, where it needs to meet some insane specifications for positional accuracy, angular accuracy, and especially(!!!) latency. Honestly, the rumors didn't do it justice: it's really elegant! The solution is exceedingly simple, low-cost, lightweight, and performant. It's much (much!) better than the image-processing techniques I've seen to date. Most importantly... I think this technology could be a "big deal" for robotics too. I had a chance to speak with Valve's Alan Yates about how the Lighthouse system works; I didn't get all the specifications, but I did get some interesting information -- so read on!
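Based on how the system has been publicly described -- a base station emits an omnidirectional sync flash and then sweeps a laser line across the room at a known rotation rate, so a photodiode's hit time encodes its bearing angle -- the core measurement can be sketched in a few lines. The 60 Hz sweep rate below is an assumption for illustration, not a confirmed spec:

```python
# Hedged sketch of the Lighthouse measurement principle: "time since
# sync flash" converts directly to a bearing angle of the photodiode.
# Two orthogonal sweeps give two angles per sensor; multiple sensors
# with known geometry then allow full pose recovery.
import math

SWEEP_HZ = 60.0                 # assumed rotor speed (illustrative)
SWEEP_PERIOD = 1.0 / SWEEP_HZ

def sweep_angle(t_sync, t_hit):
    """Bearing angle (radians) of a photodiode, from the time the
    swept laser plane crossed it relative to the sync flash."""
    dt = t_hit - t_sync
    return 2.0 * math.pi * (dt / SWEEP_PERIOD)

# A diode hit 1/240 s after sync sits a quarter-turn into the sweep:
angle = sweep_angle(t_sync=0.0, t_hit=1.0 / 240.0)
print(math.degrees(angle))  # ~90 degrees
```

Note what's *not* needed: no cameras, no image processing, no heavy compute -- just cheap photodiodes and precise timing, which is exactly where the low cost and low latency come from.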
I just completed my annual tally of VC funding in robotics for 2014, and the results were pretty amazing. By my primitive calculations, VC funding for robotics increased by over 36% compared to 2013 -- totaling a whopping $341.3 Million for 2014. I'll tentatively ascribe the massive gains to: (1) a frothy funding environment; (2) lots and lots of drone startups; and (3) a series of later-stage medical robotics companies raising large sums. If 2014 is any indication, I think 2015 could be another really big year for robotics!
A few years back I learned about an actuator that is surprisingly simple, has a high gear ratio, has been known since ancient times, and can be produced with nothing more than a cheap hobby motor and some string. Crazy, right?! Until recently I was barred from discussing the actuator due to NDAs with an anonymous startup, despite plenty of academic work on the subject. Thankfully those NDAs have expired, and I've received permission to discuss some of that company's findings publicly. Read on for details!
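The description above matches the "twisted string" transmission from the academic literature, so here's a hedged sketch of its kinematics under that assumption (string length and radius below are illustrative values, not the company's): a string of length L and radius r twisted by theta radians forms a helix, and its axial length shrinks to sqrt(L^2 - (theta*r)^2).

```python
# Twisted-string-actuator kinematics sketch (assumed mechanism).
# Many motor turns produce a small linear contraction -- an enormous
# effective gear ratio from nothing but a motor and some string.
import math

def contraction(theta, L=0.20, r=0.0005):
    """Linear contraction (m) after theta radians of motor twist,
    for a 20 cm string of 0.5 mm radius (illustrative values)."""
    return L - math.sqrt(L**2 - (theta * r)**2)

# A cheap hobby motor spinning 20 turns pulls in about a centimeter:
turns = 20
dx = contraction(2 * math.pi * turns)
print(dx)  # ~0.01 m of travel for 20 full motor rotations
```

That ratio (20 rotations per ~1 cm of travel) is where the large force amplification comes from, at the cost of speed and stroke length.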
Back in January 2014 I purchased an Anki Drive "robot car" setup, along with an iPad to control it. I had high hopes. Anki had a great story: originated in CMU's robotics program; raised $50M in VC funding (now more than $105M); and launched at Apple's WWDC conference in 2013. It was touted as "bringing robots out of the lab and into homes, using real AI." I couldn't wait to get my hands on one -- to try it out and to tear it apart to figure out how it works. Read on for details, including an inside look at how the robot localization works. But spoiler alert... I was disappointed.
This weekend a new Disney movie came out in theaters: Big Hero 6. I'm super excited for this movie, and I'm sure it will be awesome! Even better, the main supporting character is a big, soft inflatable robot named Baymax. As long-time readers of Hizook will undoubtedly know, I'm a huge proponent of inflatable robots -- they have challenges, but they're tough to beat in terms of cost and power-to-weight ratio! And while the movie was inspired by some early work from CMU, I think this is an ideal opportunity to look at the cutting edge in soft, inflatable robotics -- which to my knowledge is dominated by an Otherlab spinout: Pneubotics. Read on for details, lots of pictures and videos, and a bonus: Watch my wife take a sucker punch to the gut from an inflatable robot! :-P
My good friend Erik Schluntz and I came up with this really cool idea: we should build room-sized 3D printers using winch robots (also called cable robots or rope robots) instead of big 3-axis gantry systems. The idea has a lot of merit: no bulky gantry, vastly lower robot cost, insanely big workspaces, and super-easy installation and calibration. We thought a bit about what it could look like as a business (eg. launching via Kickstarter), but opted to pass for the time being. In the interim, there are some supremely entertaining possibilities for scaling it up to the size of a football stadium... If anyone can get us access to a stadium, we'd be game for the mother of all weekend hackathons! Anyway, we figured we'd expand a bit on the idea and examine some earlier, related systems.
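Part of why installation and calibration are so easy: the inverse kinematics of a winch robot are trivial. With the winch anchors fixed (say, at the room's upper corners), each commanded cable length is just the straight-line distance from its anchor to the print head. A minimal sketch, with an assumed 5x5x3 m room:

```python
# Inverse kinematics for a 4-winch cable robot: cable length per
# winch is the Euclidean distance from that anchor to the print head.
# Anchor layout is an illustrative assumption (5 x 5 x 3 m room).
import math

ANCHORS = [  # (x, y, z) winch positions in meters
    (0.0, 0.0, 3.0),
    (5.0, 0.0, 3.0),
    (5.0, 5.0, 3.0),
    (0.0, 5.0, 3.0),
]

def cable_lengths(head):
    """Cable length to command at each winch for a head position."""
    return [math.dist(head, a) for a in ANCHORS]

# Head at the room's center, 1 m off the floor:
lengths = cable_lengths((2.5, 2.5, 1.0))
print(lengths)
```

The main catch, of course, is that cables can only pull -- the winches must coordinate tensions to keep every cable taut, and gravity (or extra cables) has to supply the downward force.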
Several of my friends attended Automatica 2014, where they saw demonstrations of the LogiMover pallet-moving system by Eisenmann. They figured I would be interested (yep!), and emailed me some details. LogiMover has an interesting design twist: It uses two independent forks (ie. two distributed robots) to lift and move pallets around a warehouse or factory floor. While not as versatile as a normal forklift (in the vertical direction), I can definitely appreciate the system's compactness and overall utility.
This is becoming something of a perennial topic for Hizook; the 2011 and 2012 lists were a hit, and people keep asking me about 2013. Let's keep the tradition alive. Robotics companies raised at least $250 Million in 2013, which compares very favorably to the $190 Million from 2012. Also of interest this year: Google snapped up 8 top-notch robotics companies (including Boston Dynamics and Redwood Robotics); Amazon announced its audacious drone delivery plans; Mako Surgical sold for $1.65B and MakerBot was acquired for $600M; a $100M VC Firm was setup to invest specifically in robotics and AI; and we now have our own robotics-specific Exchange Traded Fund (ETF), Robo-Stox (NASDAQ:ROBO)!
A few of my robotics friends (Issei Takino, Huan Liu, and Rosen Diankov of OpenRave fame) quietly started a company in Japan called Mujin that is modernizing old-school manufacturing. Mujin uses modern software and interactive motion controllers to help large manufacturers update their production lines (using decades-old robots) in drastically shorter times compared to what would be required using crummy old motion-jogging panels. By most accounts, this is boring non-flashy robotics work. But that's precisely why it's awesome. All too often it seems like roboticists are fixated on the glitz and glam of building robots, without solving real-world problems. Mujin is solving real, dull robotics problems in a big market with lots of money. I had the opportunity to grill them on their company and the differences between doing business in the US versus Japan -- see below for a Q&A.
Giant 3D printers are cool, but they have a fundamental limitation: the parts they build can be no larger than their finite workspace. Let me propose an alternative. What if we gave mobile manipulators a "toolbelt" of rapid prototyping utensils, so that our general-purpose (home!) robots are effectively on-the-spot machinists capable of building almost any CAD design?! Actually, there's some compelling evidence that we're closer to this vision than one might imagine. In the interim, I think there's some great low-hanging fruit around this idea... if only there were a diligent robotics researcher (i.e. not me) to pursue it. Any takers?
Over the years I've been keeping an informal list of large rapid prototyping systems. I'd like to take a moment to share some of these, including: big 3-axis systems that print plastic, sand, or cement; large robot arms with extruders and milling bits; and large industrial arms for bending metal and assembling modular structures. This list is woefully incomplete, but it provides some fun eye candy. Enjoy!
Due to the popularity of Hizook's list of VC Funding for Robotics in 2011, we figured folks would be curious how 2012 fared in comparison.... and the news is promising! By our tally, robotics companies raised ~$190 Million (breakdown below) in VC funding in 2012 -- approximately the same amount as in 2011, though it'll probably be more once folks speak up in the comments (please do!). Perhaps more exciting, 2012 was a great year for robotics as an industry: we saw the creation of Grishin Robotics, the first VC fund dedicated exclusively to robotics; several robotics companies were acquired for impressively-high valuations (Kiva for $775 Million, Evolution for $74 Million, and Aldebaran for $100 Million); innumerable crowd-funded robotics campaigns launched new companies; and robotics-specific grants to academia seemed to be on the up (eg. NSF NRI and Darpa M3, ARM, Humanoid programs).
Some people have been asking, "Travis, where did you (and Hizook) disappear to?" Well... I'm taking a prolonged (but ultimately temporary) hiatus from robotics to co-found a new, YCombinator-funded web startup: Lollipuff.com -- an online auction site dedicated exclusively to women's designer clothes and accessories, where every item is authenticated by a team of experts. Surprised? Frankly, I am too. I'll explain more down below... but since this is a robotics website, I figure I have to talk about robotics too. So really, this is two blog posts in one: (1) looking at the intersection of fashion and robotics, and (2) a description of my latest endeavors.
Underactuated robot hands -- with fewer motors than joints -- have been around for decades; however, we've seen a surge of new designs in recent years. Personally, I attribute this trend to the availability of low-cost robot arms and associated open-source software (ie. ROS). In any case, underactuated hands offer numerous advantages in cost, size, weight, and mechanical / electrical complexity while providing a large array of shape-adaptive grasps. In this article, I'd like to introduce you to two new underactuated robot grippers: the Lacquey "Fetch Hand" and the Willow Garage "Velo Gripper." Be sure to check out the photos and videos below.
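To see where the shape adaptivity comes from, consider a single tendon routed over pulleys at two finger joints: the same tendon tension torques every joint, so when the proximal link stalls against an object, the remaining tendon travel keeps flexing the distal joint, wrapping the finger around the object. Here's a crude kinematic sketch -- the even split of tendon travel and all the numbers are illustrative assumptions, not taken from either gripper:

```python
# Crude sketch of single-tendon, two-joint underactuation.
# The even split of tendon travel across free joints is a
# simplification; the real distribution depends on joint torques,
# spring preloads, and friction.

R = (0.010, 0.007)  # pulley radii (m) at proximal, distal joints

def flex(angles, contact, d_tendon):
    """Distribute a tendon-travel increment d_tendon (m) across the
    joints still free to move (contact[i] == False); dtheta = dl/r."""
    free = [i for i in range(len(angles)) if not contact[i]]
    if not free:
        return angles
    new = list(angles)
    for i in free:
        new[i] += (d_tendon / len(free)) / R[i]
    return new

angles = [0.0, 0.0]
angles = flex(angles, contact=[False, False], d_tendon=0.004)  # free closing
angles = flex(angles, contact=[True, False], d_tendon=0.004)   # proximal stalled
print(angles)  # distal joint keeps flexing after proximal contact
```

One motor, two joints, and the grasp conforms to the object's shape with no extra sensing or control -- that's the whole appeal.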
I'd like to introduce you to TechJect, a Georgia Tech spinout that is building a robot dragonfly based on years of academic research and $1+ Million in military funding -- aka, one hell of a toy! Early prototypes look pretty compelling (see photos and videos below). Currently, they have an ongoing IndieGoGo crowdfunding campaign that has already raised ~$200k in just a few days, where you can essentially preorder robot dragonflies for about the same price as a quadrotor platform! (Albeit with delivery sometime around September or October of 2013.) Presumably, their 4-wing ornithopter is able to independently control the pitch and amplitude of each wing to perform aggressive aerial maneuvers, which should enable the dragonfly to both fly and hover -- a unique capability compared to most other UAVs (fixed-wing or helicopter). Initial product sketches suggest that the dragonflies will be ~6 inches long, weigh ~25 grams, and have numerous sensors and connectivity options (ie. RC control, WiFi, IMU, cameras... the usual).
Today, Suitable Technologies announced the "Beam Remote Presence System" -- aka, the Beam telepresence robot. You can see the press announcement below and find plenty of commentary elsewhere online (eg. about how the $16k price tag compares to competitors). Normally, I'd write a similar reaction piece. But instead, I'm going to violate two of my personal rules: I'm going to write a negative piece on Hizook, and I'm going to do it while somewhat mad. Suitable, what were you thinking?!? BEAM robots have been around since the 1990s. Mark Tilden's patent for super-simple analog "nervous nets" used to build (often solar-powered) "Braitenberg vehicles" dates back to 1992. They have a Wikipedia page; there are several in-print books; there are entire communities and companies built around them. They're the reason I learned electronics and got into robotics as a child. To have the name subverted in this way is sickening. Did no one at Suitable run a simple Google search?! Furthermore, there is no way that "beam" should be eligible for a trademark -- there's too much existing prior use in robotics.
Rethink Robotics (formerly Heartland Robotics) has come out of stealth mode with the announcement of Baxter -- a $22,000 dual-arm, human-scale robot with compliant joints. The details are available in Rethink Robotics' Baxter datasheet and brochure, but here are a few key highlights: dual 7-DoF arms with a 5 lb (2.3 kg) payload and a max no-load speed of 3.3 ft/sec (1 m/sec). The arms are compliant owing to series-elastic actuators (SEAs) with force control and torque sensing at each joint. The robot torso, sans pedestal, is 3'1" (94 cm) tall, and the robot has a reach of roughly 104 cm. The robot weighs in at 165 lbs (75 kg) and has a suite of sensors, including: 5 cameras (1 up top, 2 in the torso, and 2 "eye-in-hand"), a 360-deg. ring of ultrasonic range sensors in the head, IR range sensors at the gripper, and (naturally) kinematics and torque sensing at each joint. Did I mention the starting price of $22,000 and that it starts shipping this October!?! This will be a HUGE deal for robotics. Comparable arms easily cost an order of magnitude more (~$100k each), so getting a full pair for $22k is going to completely change the game -- perhaps even more than the Kinect. It's an exciting time for robotics! Read on for pictures, an interview with Rod Brooks (CTO and co-founder of Rethink), and the press release.
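For those unfamiliar with series-elastic actuators, the core idea is disarmingly simple: put a spring of known stiffness between the motor and the joint, and joint torque becomes stiffness times the measured spring deflection -- so torque control reduces to a servo loop on deflection. A minimal sketch with illustrative numbers (not Rethink's):

```python
# Series-elastic actuator (SEA) sketch: the spring between motor and
# joint turns torque sensing into a simple deflection measurement.
# Spring constant and gain are illustrative assumptions.
K_SPRING = 300.0   # N*m/rad, assumed spring stiffness
KP = 4.0           # proportional gain on torque error

def sea_torque(theta_motor, theta_joint):
    """Measured joint torque from spring deflection."""
    return K_SPRING * (theta_motor - theta_joint)

def torque_control_step(tau_desired, theta_motor, theta_joint):
    """One P-control step: a motor velocity command (rad/s) that
    drives the spring deflection toward the desired torque."""
    tau = sea_torque(theta_motor, theta_joint)
    return KP * (tau_desired - tau) / K_SPRING

tau = sea_torque(theta_motor=0.05, theta_joint=0.0)
print(tau)  # 15 N*m from 0.05 rad of deflection
```

The spring also makes the arm inherently soft on impact -- which is exactly why Baxter can work safely next to people on a factory floor.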
I love Artaic. They're revolutionizing a millennia-old art form (tile mosaics) using dead-simple pick-and-place robots to create a successful "non-robotics company." Yet in my mind... they're the quintessential "robotics" company. I had cause to visit their headquarters in Boston last March, where I got a special tour from Artaic co-founders Paul Reiss (Creative Director) and Ted Acworth (CEO). Allow me to share their process, some of their beautiful mosaics, their unique outlook on robotics, and a quick sneak peek at advances coming down the pipeline.
Movies and sci-fi books inspire roboticists to push the envelope, but they've also skewed the public's perception of robot capabilities. This problem is being exacerbated by researchers. In the last three months, I've had to shatter a few dreams: "Your $300 AR.Drone or $150 Ladybird will not be able to perform insane autonomous aerial maneuvers (yet). The UPenn quadrotors rely on $20k-$50k camera-based (Vicon) motion capture systems, which provide global pose estimates of each UAV at millimeter accuracy at up to 1 kHz (and often rely on an external, centralized motion-planning computer too)." The fact that this crucial aspect of the videos does not register with intelligent people means that researchers are being disingenuous and violating their duty to the public -- which sucks, because their projects and research are awesome! And this is just the example that happens to be most salient to me at the moment. In this post I'd like to explore some "best practices" for robot videos so that we can quit misleading one another.
I have lots of love for Pittsburgh in particular, but it really pisses me off when people on the East Coast repeat a bunch of falsehoods (see #8) about how Boston and Pittsburgh compare to Silicon Valley and the rest of the world. Many people in Pittsburgh and Boston -- including people I call friends and mentors -- smugly think that the MIT- and CMU-centered robotics clusters are leading the world in robotics. This is demonstrably false.