Miso scores $10 million to bring its hamburger-flipping robot to more restaurants


Pasadena-based hardware startup Miso Robotics just got a big vote of confidence from investors, in the form of a $10 million Series B. This latest windfall, led by Acacia Research Corporation, brings the company’s total disclosed funding to $14 million and arrives as the company ramps up production and gets ready to deliver its hamburger-cooking robot Flippy to 50 CaliBurger locations.

“We’re super stoked to use this funding to develop and scale our capabilities of our kitchen assistants and AI platform,” CEO/co-founder Dave Zito said on a call with TechCrunch ahead of the announcement. “Our current investors saw an early look at our progress, and they were so blown away that they doubled down.”

A robot’s view of the grill

The round also includes new investors, including, notably, Levy, a Chicago-based hospitality company that runs restaurants and vending machines in entertainment and sporting venues in the U.S. and U.K. The company’s investment is clearly a strategic one, as it looks toward staffing solutions in its heavily trafficked locations.

“The Levy participation is really centered around their looking at this future world where people are increasingly wanting prepared foods,” says Zito. “People really like the idea of a kitchen assistant that can really come in and be that third hand for the overworked staff. They’re all reporting high turnover rate and increasing customer demand for fresh ingredients prepared quickly. Trying to keep that at accessible prices is hard.”

We’ve already seen a number of demos of Flippy in earlier iterations, including a peek inside the robot’s AI-based vision system back at Disrupt in September. The company promises a more public debut of the robot at the Pasadena CaliBurger location in “the coming weeks.”


Sony now has a Koov robotics learning kit for US classrooms


After soft-launching a blocks-based educational robotics kit in the US last summer to gauge local interest, Sony has judged the reception for its Koov kit warm enough to fire a fully fledged educator offering into the US market. The Koov Educator Kit goes up for pre-order today, with an estimated shipping date of March 25.

The $520 price tag puts it at the pricier end of the spectrum as STEM-targeted learn-to-code gizmos go. (And there are a lot of those quasi-educational ‘toys’ for parents to choose from these days.) But, as the name suggests, Sony’s Koov kit is specifically designed for educators and for use in classrooms, with each kit supporting multiple users.

Specifically, each Koov Educator kit is good for “up to five students”, according to Sony — presumably so long as the kids play nice together and fairly share the blocks and bits. So in that context the pricing, while not cheap, looks more reasonable.

Sony is also explicitly targeting ‘STEAM’ learning, with the ‘A’ in the acronym standing for ‘Art’, alongside the more usual Science, Technology, Engineering and Math components, which sets Koov apart from some less flexible learn-to-code gizmos on the market.

Though other modular electronics coding kits, from the likes of LittleBits and Sam Labs (to name two), are playing in much the same space.

The Koov system is designed for children aged eight and older. And as well as translucent plug-together blocks and Arduino-compatible electronics bits, there’s a Scratch-based drag-and-drop coding interface to link physical creation with digital control, via a cross-platform companion app.

The Educator Kit contains more than 300 connectable blocks in all, plus multiple sensors, motors, LEDs and other electronics bits and bobs (the full inventory is here). It also includes class management software, curriculum-aligned lesson plans, step-by-step guides for kids and student progress reports.

Sony says the Koov app serves up more than 30 hours of educational content via a Learning Course feature, which is intended to offer students an introduction to “key concepts” in coding, building and design. As with the majority of these STEM gizmos, the educational philosophy leans heavily on the idea of learning through playing (around).

The Koov kit also includes 23 pre-designed, pre-coded “Robot Recipes” to encourage kids to get building right away. Though the wider aim of the Koov system is to support children being able to design and build their own robots (back to that ‘Art’ element) — and indeed Sony claims there are “countless” ways to stick its blocks and bits together. So much like Lego, then.

It also bills the system as “flexible enough” for students to use independently while also providing material to support structured learning.

Boston Dynamics’ newest robot learns to open doors


We knew this day would come sooner or later. Like the cloned velociraptors before it, Boston Dynamics’ newly redesigned Spot Mini has figured out how to open doors — with either its arm or face, depending on how you look at it.

The team behind BigDog proves that it’s still the master of viral robotic marketing, even after changing hands from Google to SoftBank. Three months after debuting a more streamlined version of its electric Spot Mini, the company’s got another teaser wherein one robot equipped with a head-mounted arm makes (relatively) quick work of a door, letting its pal waltz through.


The video’s impressive both for the agility of the arm itself and for the robot’s ability to maintain balance as it swings open what looks to be a fairly heavy door.

“Clever girl,” indeed.

Like the last video, the teaser doesn’t offer a ton of insight into what’s new with the bumblebee-colored version of the company’s already announced robot. Last time out, it appeared as though we got a preview of a pair of Kinect-style 3D cameras that could give a little more insight into the robot’s navigation system.

That tech seemed to hint at the possibility of an advanced autonomous control system. Given the brevity of the video, however, it’s tough to say whether someone’s controlling the ‘bots just out of frame.

If the company managed to program Spot Mini to actually open the door on its own in order to help free its friend, well, perhaps it’s time to be concerned.

Teaching robots to understand their world through basic motor skills


Robots are great at doing what they’re told. But sometimes inputting that information into a system is a far more complex process than the task we’re asking them to execute. That’s part of the reason they’re best suited for simple/repetitive jobs.

A team of researchers at Brown University and MIT is working to develop a system in which robots can plan tasks by developing abstract concepts of real-world objects and ideas based on motor skills. With this system, the robots can perform complex tasks without getting bogged down in the minutiae required to complete them.

The researchers programmed a two-armed robot (Anathema Device or “Ana”) to manipulate objects in a room — opening and closing a cupboard and a cooler, flipping on a light switch and picking up a bottle. While performing the tasks, the robot was taking in its surroundings and processing information through algorithms developed by the researchers.


According to the team, the robot was able to learn abstract concepts about the objects and the environment. Ana was able to determine that doors need to be closed before they can be opened.

“She learned that the light inside the cupboard was so bright that it whited out her sensors,” the researchers wrote in a release announcing their findings. “So in order to manipulate the bottle inside the cupboard, the light had to be off. She also learned that in order to turn the light off, the cupboard door needed to be closed, because the open door blocked her access to the switch.”

Once processed, the robot associates a symbol with one of these abstract concepts. It’s a sort of common language developed between the robot and human that doesn’t require complex coding to execute. This kind of adaptive quality means the robots could become far more capable of performing a greater variety of tasks in more diverse environments by choosing the actions they need to perform in a given scenario.

“If we want intelligent robots, we can’t write a program for everything we might want them to do,” George Konidaris, a Brown University assistant professor who led the study, told TechCrunch. “We have to be able to give them goals and have them generate behavior on their own.”

Of course, asking every robot to learn this way is equally inefficient, but the researchers believe they can develop a common language and create skills that could be downloaded to new hardware.

“I think what will happen in the future is there will be skills libraries, and you can download those,” explains Konidaris. “You can say, ‘I want the skill library for working in the kitchen,’ and that will come with the skill library for doing things in the kitchen.”
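To make that idea concrete, here is a toy sketch, in Python, of how motor skills paired with symbolic preconditions and effects let a planner chain behaviors without touching motion-level detail. This is emphatically not the Brown/MIT system’s code; every symbol and skill name below is invented to mirror the cupboard-and-light example from the release.

```python
# Toy illustration of symbolic planning over motor skills (invented names;
# not the Brown/MIT system). Each skill advertises which symbols must hold
# before it runs and which symbols it makes true or false afterwards.
from dataclasses import dataclass

@dataclass(frozen=True)
class Skill:
    name: str
    preconditions: frozenset  # symbols that must hold before the skill runs
    add_effects: frozenset    # symbols the skill makes true
    del_effects: frozenset    # symbols the skill makes false

SKILLS = [
    Skill("close_cupboard", frozenset({"cupboard_open"}),
          frozenset({"cupboard_closed"}), frozenset({"cupboard_open"})),
    Skill("flip_light_off", frozenset({"cupboard_closed", "light_on"}),
          frozenset({"light_off"}), frozenset({"light_on"})),
    Skill("grasp_bottle", frozenset({"light_off"}),
          frozenset({"holding_bottle"}), frozenset()),
]

def plan(state: frozenset, goal: str, depth: int = 10):
    """Naive depth-limited forward search: apply any skill whose
    preconditions hold until the goal symbol becomes true."""
    if goal in state:
        return []
    if depth == 0:
        return None
    for skill in SKILLS:
        if skill.preconditions <= state:
            next_state = (state - skill.del_effects) | skill.add_effects
            rest = plan(next_state, goal, depth - 1)
            if rest is not None:
                return [skill.name] + rest
    return None

print(plan(frozenset({"cupboard_open", "light_on"}), "holding_bottle"))
# -> ['close_cupboard', 'flip_light_off', 'grasp_bottle']
```

The real system learns such symbols from sensorimotor experience rather than having them hand-coded; the point of the sketch is simply that, once the abstraction exists, planning reduces to cheap set operations instead of reasoning about raw motor commands.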

Researchers discovered a new kind of stereo vision by putting tiny 3D glasses on mantises


Researchers at Newcastle University in the U.K. believe they’ve discovered a differently evolved form of stereo vision in mantises. The research team studied the phenomenon in the insects precisely as one would hope — by attaching a pair of tiny 3D glasses to their bug eyes.

The scientists attached a mantis-sized pair of dual-color 3D glasses to the insects’ eyes, using beeswax as a temporary adhesive. The team then showed video of potential prey, which the mantises lunged at. In that respect, the bugs appeared to approach 3D image processing in much the same way humans do.


But when the insects were shown dot patterns used to test 3D vision in humans, they reacted differently. “Even if the scientists made the two eyes’ images completely different,” the university writes, describing its findings, “mantises can still match up the places where things are changing. They did so even when humans couldn’t.”

According to the university, the discovery of stereo vision in mantises makes them unique in the insect world. The manner of 3D vision is also different from the variety that evolved in other animals like monkeys, cats, horses, owls and us. In this version of the trait, the mantises are matching motion perceived between the two eyes, rather than the brightness we use.

“We don’t know of any other kind of animal that does this,” Dr. Vivek Nityananda told TechCrunch. “There’s really no other example or precedent we have for this kind of 3D vision.” Nityananda adds that this sort of 3D vision has been theorized in the past, but the team believes that this is the first time it’s been detected in the animal kingdom.

The scientists believe the system evolved to be much less complex than our version of 3D vision, so that it can be processed by the mantises’ simpler brains. That, Nityananda says, could be a boon for roboticists looking to implement 3D systems in less complex, lighter-weight machines.

“It’s a simpler system,” he says. “All mantises are doing is detecting the change in the relevant position in both eyes. Detection of change would be much easier to implement [in robotics], versus the more elaborate details in matching the views of two eyes. That would require much less computation power, and you could put that into perhaps a much more lightweight robot or sensor.”
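For a rough sense of why that appeals to roboticists, here is a minimal Python sketch of change-based stereo matching: each eye’s image is reduced to a map of where things changed between frames, and disparity is found by matching those change maps rather than raw brightness. This is an assumption-laden illustration, not the researchers’ algorithm; the patch size, disparity range and function names are all invented.

```python
# Minimal sketch of change-based stereo matching (illustrative only; the
# patch size, disparity range and function names are all assumptions).
import numpy as np

def change_map(prev_frame: np.ndarray, curr_frame: np.ndarray) -> np.ndarray:
    """Per-pixel temporal change for one eye: absolute frame difference."""
    return np.abs(curr_frame.astype(float) - prev_frame.astype(float))

def change_based_disparity(left_prev, left_curr, right_prev, right_curr,
                           max_disp: int = 16, patch: int = 5) -> np.ndarray:
    """Match windows of the two eyes' *change* maps (not their brightness)
    and return the best horizontal disparity per pixel."""
    left_chg = change_map(left_prev, left_curr)
    right_chg = change_map(right_prev, right_curr)
    h, w = left_chg.shape
    half = patch // 2
    disparity = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half, w - half):
            ref = left_chg[y - half:y + half + 1, x - half:x + half + 1]
            best_score, best_d = np.inf, 0
            for d in range(min(max_disp, x - half) + 1):
                cand = right_chg[y - half:y + half + 1,
                                 x - d - half:x - d + half + 1]
                score = np.sum((ref - cand) ** 2)  # SSD on change, not intensity
                if score < best_score:
                    best_score, best_d = score, d
            disparity[y, x] = best_d
    return disparity
```

Because the matching signal is where motion happened rather than what the scene looks like, the two eyes’ images can disagree completely in brightness, as in the dot-pattern experiment, and the match can still succeed. Only moving regions carry signal here, which is also what keeps the computation cheap enough for a lightweight sensor.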

Aurora will power Byton EV’s autonomous driving features


Aurora, the self-driving startup founded by Google self-driving car project alum Chris Urmson, along with Tesla Autopilot developer Sterling Anderson, CMU robotics expert and Uber vet Drew Bagnell, and a team of industry experts, will be making the autonomous smarts for the forthcoming electric vehicle from Byton, a startup that had a splashy debut at CES earlier this year.

Byton’s Concept electric SUV is a car with a lot of interesting tech features, aside from its all-electric drivetrain. The vehicle has a massive, dashboard-covering display that incorporates information readouts, entertainment options and vehicle controls. It’s a screen that seems somewhat ill-suited to keeping a driver’s attention on the road, and the Byton car also has front seats that swivel toward the inside of the vehicle so that those in the front can better interact with those in the back.

Both of those features are geared more toward a future in which autonomous driving is a ready and viable option for Byton owners. The car is aiming for a 2019 starting ship date, by which time it’s possible self-driving features won’t seem such a distant dream. And now we know that Byton has a technology partner on the autonomous driving side of things with the technical know-how to make it an even more realistic expectation.

Aurora, despite officially breaking cover just last year, is already working with a range of automakers on their autonomous driving technology, including Volkswagen and Hyundai. Aurora CEO Chris Urmson explained that, given its goals, the company is happy to work with companies at all stages of development and maturity to help make self-driving a practical reality.

“Our mission is to deliver the benefits of self-driving technology safely, quickly and broadly,” he said in an interview. “So for us to have that broad part, it means we have to work with a number of great partners, and we’re very fortunate with the folks we have [as partners] to date… this is how we help the business, and we look forward to being able to engage with others in the future.”

For Byton and Aurora, this partnership will kick off with pilot test driving in California sometime soon, and Byton hopes to eventually tap Aurora in pursuit of its goal of fielding premium electric consumer vehicles with SAE Level 4 and Level 5 autonomous capabilities.

Aurora as a company is excited about its progress during its first year in operation, and is ramping up staffing and attracting key talent in a very competitive industry thanks to its pedigree and founding team, Urmson tells me.

“It started with a handful of us, a couple in my living room here in California, and a couple in Pittsburgh. We’ve been growing the team, that’s been one of the core focuses of this last year,” he said. “In my previous gig I had the privilege of helping build that program from day one, to a massive organization certainly leading the space, and now with Sterling and Drew, we have the opportunity to build version two of that, and learn from our experience, and build an organization and build a technology that can have a huge impact on the world, and do that quickly and safely.”

