Oracle to expand automation capabilities across developer cloud services


Last fall at Oracle OpenWorld, chairman Larry Ellison showed he was a man of the people by comparing the company’s new autonomous database service to auto-pilot on his private plane. The comparison aside, those autonomous capabilities were genuinely advanced, providing customers with a self-provisioning, self-tuning and self-repairing database. Today, Oracle announced it was expanding that automation beyond the database to other parts of its developer cloud platform.

The company started with that autonomous database, known by the exciting name 18c, which like Ellison’s airplane practically runs itself. “We are extending the automation across all of our cloud platform services, making them self-driving, self-securing and self-repairing, and eliminating human requirements to handle all of the [installation], protection and services,” Amit Zavery, executive vice president for the Oracle Cloud Platform, told TechCrunch.

The automation will be applied to a broad array of Oracle cloud services, including application development, data integration and security. The new services are designed to remove a significant amount of the complexity and reduce the time and cost associated with launching, running and maintaining cloud services. The goal is to leave it to the machine wherever possible.

Developers still need to do their jobs, but the automation drastically reduces the day-to-day operational work and initial setup tasks, which should increase the efficiency of the IT team, Zavery said. “The time to market, risk and cost come down. The mundane tasks go out of your hands and you can spend more time on the application you want to build,” he explained.

This automation uses a lot of artificial intelligence and machine learning under the hood and should speed up the transition to the cloud for Oracle’s customers. What’s more, the intelligence layer means that technology should improve over time as it learns the intricacies of each customer’s individual requirements.

Ellison founded Oracle in the late 1970s in a very different computing world. Over the last several years, the company has been transitioning to a cloud model, but it was very late to the game and far behind companies like Amazon, Microsoft, Google, IBM and even Alibaba. Zavery sees this level of automation as a key differentiator between Oracle and its cloud competitors.

The new autonomous services will be rolling out over the first half of this year, Zavery said.

Featured Image: Bloomberg/Getty Images

Aurora will power Byton EV’s autonomous driving features


Aurora, the self-driving startup founded by Google self-driving car project alum Chris Urmson, along with Tesla Autopilot developer Sterling Anderson, CMU robotics expert and Uber vet Drew Bagnell, and a team of industry experts, will be making the autonomous smarts for the forthcoming electric vehicle from Byton, a startup that had a splashy debut at CES earlier this year.

Byton’s Concept electric SUV is a car with a lot of interesting tech features, aside from its all-electric drivetrain. The vehicle has a massive, dashboard-covering display that incorporates information readouts, entertainment options and vehicle controls. It’s a screen that seems ill-suited to keeping a driver’s attention on the road, and the Byton car also has front seats that swivel toward the inside of the vehicle so that those in the front can better interact with those in the back.

Both of those features are more geared toward a future in which autonomous driving is a ready and viable option for Byton owners. The car is aiming for a 2019 starting ship date, by which time it’s possible self-driving features won’t seem such a distant dream. And now we know that Byton has a technology partner on the autonomous driving side with the technical know-how to make that an even more realistic expectation.

Aurora, despite officially breaking cover only last year, is already working with a range of automakers on their autonomous driving technology, including Volkswagen and Hyundai. Aurora CEO Chris Urmson explained that, given its goals, the company is happy to work with partners at all stages of development and maturity to help make self-driving a practical reality.

“Our mission is to deliver the benefits of self-driving technology safely, quickly and broadly,” he said in an interview. “So for us to have that broad part, it means we have to work with a number of great partners, and we’re very fortunate with the folks we have [as partners] to date… this is how we help the business, and we look forward to being able to engage with others in the future.”

For Byton and Aurora, this partnership will kick off with pilot test driving in California sometime soon, and Byton hopes to eventually tap Aurora as it pursues its goal of fielding premium electric consumer vehicles with SAE Level 4 and Level 5 autonomous capabilities.

Aurora as a company is excited about its progress during its first year in operation, and is ramping up staffing and attracting key talent in a very competitive industry thanks to its pedigree and founding team, Urmson tells me.

“It started with a handful of us, a couple in my living room here in California, and a couple in Pittsburgh. We’ve been growing the team, that’s been one of the core focuses of this last year,” he said. “In my previous gig I had the privilege of helping build that program from day one, to a massive organization certainly leading the space, and now with Sterling and Drew, we have the opportunity to build version two of that, and learn from our experience, and build an organization and build a technology that can have a huge impact on the world, and do that quickly and safely.”


Continental taps Nvidia for its full-scale autonomous vehicle platform


Continental is the latest top-tier automotive supplier to work with Nvidia, and the latest to announce its intent to build a full-scale, top-to-bottom autonomous driving system. Continental (which might be most familiar from its tire division, but which supplies a range of automotive parts and systems across the industry) will be using Nvidia’s DRIVE autonomous vehicle platform for its system, and hopes to bring it to market by 2021.

The two companies will dedicate engineering teams to the effort, they announced. The Continental system will be built on Nvidia’s DRIVE Xavier system-on-a-chip, DRIVE OS and DRIVE AV software, while Continental contributes its ASIL-D safety certification expertise, as well as radar, camera and LiDAR sensor solutions.

Continental’s offering aims to provide automakers and other prospective clients with autonomy from Levels 2 through 5, which spans advanced cruise control features similar to Tesla’s Autopilot, Nissan’s ProPilot and GM’s Super Cruise, all the way up to true self-driving without so much as a steering wheel or a gas/brake pedal in sight.
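For readers keeping score, those levels refer to the SAE J3016 driving-automation scale, which runs from Level 0 (no automation) to Level 5 (full automation). As a rough reference, here is the scale paraphrased as a small Python lookup; the one-line descriptions are informal summaries, not official SAE wording:

```python
# Informal paraphrase of the SAE J3016 driving-automation levels (not official wording).
SAE_LEVELS = {
    0: "No automation: the human driver does everything",
    1: "Driver assistance: the system handles steering OR speed, not both",
    2: "Partial automation: steering AND speed, but a human must supervise "
       "(e.g., Tesla Autopilot, Nissan ProPilot, GM Super Cruise)",
    3: "Conditional automation: the system drives, but a human must take over on request",
    4: "High automation: no human needed within a defined operating domain",
    5: "Full automation: drives anywhere, with no steering wheel or pedals required",
}

for level, description in SAE_LEVELS.items():
    print(f"Level {level}: {description}")
```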

Nvidia’s partnerships already span the industry and include many top-tier OEMs and suppliers, but Continental is still a formidable addition to the mix, and one that will help with its long-term positioning in the autonomous vehicle market.

Foxconn to plug at least $340M into AI R&D over five years


Manufacturing giant Foxconn has said it will make a major investment in artificial intelligence-based R&D as it looks for new business growth opportunities in a cooling global smartphone market, Nikkei reports.

“We will at least invest some 10 billion New Taiwan dollars ($342M) over five years to recruit top talent and deploy artificial intelligence applications in all the manufacturing sites,” said chairman Terry Gou.

“It’s likely that we could even pour in some $10BN or more if we find the deployments are very successful or can really generate results.”

Gou added that the ambition is to become “a global innovative AI platform rather than just a manufacturing company”.

Data put out this week by Strategy Analytics show a 9 percent fall in global smartphone shipments in Q4 2017, the biggest such drop in smartphone history, which the analyst blames on the bottom falling out of the smartphone market in China.

“The shrinkage in global smartphone shipments was caused by a collapse in the huge China market, where demand fell 16 percent annually due to longer replacement rates, fewer operator subsidies and a general lack of wow models,” noted Strategy Analytics’ Linda Sui in a statement.

On a full-year basis, Strategy Analytics records global smartphone shipments growing 1 percent, topping 1.5 billion units for the first time.

But there’s little doubt the smartphone growth engine that’s fed manufacturing giants like Foxconn for so long is winding down.

This week, for example, Apple, Foxconn’s largest customer, reported a dip in iPhone sales for the holiday quarter, though Cupertino still managed to carve out more revenue (thanks to that $1K iPhone X price tag). Those kinds of creative pricing opportunities aren’t on the table for electronics assemblers, so for them it’s all about using technology to do more for less.

According to Nikkei, Foxconn intends to recruit up to 100 top AI experts globally. It also said it will recruit thousands of less experienced developers to work on building applications that use machine learning and deep learning technologies.

Embedding sensors into production line equipment to capture data that feeds AI-fueled automation development is a key part of the R&D plan; Foxconn has said previously that it wants to offer advanced manufacturing experiences and services, with an eye on competing with the likes of General Electric and Cisco.

The company has also been working since July with Andrew Ng’s new AI startup Landing.ai, which is itself focused on plugging AI into industries that haven’t yet tapped into the tech’s transformative benefits, with a first focus on manufacturing.

And Gou confirmed the startup will be a key partner as Foxconn works towards its own AI-fueled transformation — using tech brought in via Landing.ai to help transform the manufacturing process, and identify and predict defects.

Quite what such AI-powered transformation might mean for the jobs of hundreds of thousands of humans currently employed by Foxconn on assembly line tasks is less clear. But it looks like those workers will be helping to train AI models that could end up replacing their labor via automation.

Featured Image: Matt Wakeman/Flickr under a CC BY 2.0 license

We were in an accident during an automated driving tech demo

[Embedded video: 360-degree in-car footage of the collision, described below]

As a transportation reporter, I’ve been in a lot of cars using either autonomous or semi-automated advanced driver assistance system (ADAS) features, on both public and private roads, and yet I’d never been in an accident during a demonstration drive of any of these features before now.

On Tuesday, January 30, two TechCrunch video production staffers and I were riding in a modified Hyundai Genesis equipped with technology created by autonomous systems startup Phantom AI, traveling south on Bayshore Freeway near Millbrae, CA.

In addition to the three TechCrunch staff, there were two Phantom AI team members in the car: founder and CEO Hyunggi Cho, and president and co-founder Chan Kyu Lee. Lee was at the wheel during the test, a demonstration of the startup’s Autopilot-like SAE Level 2 partially autonomous system, designed to maintain lane position, keep a set distance from vehicles ahead and switch lanes automatically when directed via the turn indicator.
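To make that division of labor concrete, here is a minimal, purely illustrative sketch of what a single tick of such a Level 2 control loop might look like. Everything here, the names, structure and gains alike, is a hypothetical simplification for illustration, not Phantom AI’s actual code:

```python
# Hypothetical sketch of one tick of a Level 2 control loop: lane keeping,
# adaptive cruise, and indicator-triggered lane changes. All names, gains and
# structure are illustrative assumptions, not Phantom AI's actual code.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Perception:
    lane_offset_m: float  # lateral offset from lane center (meters)
    lead_gap_m: float     # distance to the vehicle ahead (meters)
    speed_mps: float      # our own speed (meters per second)

def l2_step(p: Perception, set_speed_mps: float,
            turn_signal: Optional[str] = None,
            time_headway_s: float = 1.8):  # ~1.8 s is a typical ACC following gap
    """One control tick: returns (steering, acceleration) commands."""
    # Lane keeping: steer proportionally back toward the lane center.
    steer = -0.5 * p.lane_offset_m

    # Adaptive cruise: track the set speed, but ease off if the gap to the
    # lead vehicle falls below the desired time headway.
    desired_gap_m = time_headway_s * p.speed_mps
    if p.lead_gap_m < desired_gap_m:
        accel = -2.0 * (desired_gap_m - p.lead_gap_m) / desired_gap_m
    else:
        accel = 0.3 * (set_speed_mps - p.speed_mps)

    # Indicator-triggered lane change: bias steering toward the signaled lane
    # (a real system would first verify that the target lane is clear).
    if turn_signal == "left":
        steer += 0.2
    elif turn_signal == "right":
        steer -= 0.2
    return steer, accel

# Example tick: centered in lane, lead car 30 m ahead, cruising at 60 MPH (~26.8 m/s).
print(l2_step(Perception(lane_offset_m=0.0, lead_gap_m=30.0, speed_mps=26.8),
              set_speed_mps=26.8))
```

Even in a real implementation, the human driver remains responsible for supervising every one of these commands, which is what makes such a system Level 2 rather than Level 4.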

During the demonstration, while the L2 system was engaged and we were traveling at 60 MPH according to Phantom (though the human-machine interface in the video above shows 70 MPH as our set cruising speed), a pickup truck ahead dropped a poorly secured garbage bin from its cargo bed onto the roadway. The car in front of us, a white Nissan Rogue, applied the brakes to avoid hitting the bin; our driver noticed the sudden stop and braked the Genesis hard in an attempt to prevent a collision, but there was very little he could do at that stage, and our car hit the Nissan at approximately 20 MPH.

The Genesis suffered significant damage to the front end, as you can see above, though it was still technically drivable (it did begin leaking radiator fluid). We first pulled to the shoulder as instructed by the California Highway Patrol, who were on the scene almost immediately, and were then directed to a Chevron station just beyond a nearby exit, where we parked on the shoulder of a turnaround and gave our contact information to a patrolman while the drivers involved provided full statements to the authorities.

Cho told us that the Automatic Emergency Braking system would normally have engaged at that point and prevented the collision, but it was disabled for the demo because it had been throwing too many false positives and was undergoing tuning. Lee’s manual disengagement, triggered by pressing the brake pedal to the floor as you can see in the video, didn’t occur fast enough for us to shed all of our speed.

Cho immediately made sure no one was hurt (we weren’t, just shaken up), and after being cleared to leave the scene, we decided to follow through with the rest of the demo in Foster City, where we rode in the startup’s Level 4 autonomous test car (a separate Genesis) around sparsely populated residential streets without incident. That may seem like an odd choice given we’d just experienced an accident riding in a car equipped with a different version of the company’s technology, but the follow-up demo involved a separate system, speeds were much lower, a different safety driver was at the wheel, and honestly, we were all a little rattled and confused following the collision.

We gave Phantom AI the opportunity to provide a full account of what occurred from its perspective, based on a review of the data from the vehicle; its breakdown is reproduced in full below. The video at the top of this article also documents the crash from within the vehicle, using a Rylo 360-degree camera. The footage was trimmed for length and exported to 1080p frames from a 4K 360-degree source automatically using Rylo’s software, and the license plate of the vehicle we hit was obscured for privacy, but the footage is otherwise unedited.

We were conducting a demo of the Adaptive Cruise Control (ACC) and Lane Keeping Assist (LKA) features of our L2 system.

Our Automatic Emergency Braking (AEB) functionality was disabled for demo purposes.

For a frame of reference we’ll use the moment of impact as t, and express the timeline in t minus x seconds.

t minus 6: 50m behind the front car; our car was cruising at 60 MPH with the L2 system’s Cruise Control feature active

t minus 6: A trash can from the pickup truck two or three cars ahead started flying off the bed

t minus 5: Our driver realized that something was up as the car in front abruptly decelerated

t minus 4 to t minus 3: Our driver began taking over from the L2 system; braking started and the L2 system was disengaged (as can be clearly seen in the HMI display)

t minus 3 to t: Braking was applied as hard as possible, with a braking force of 5~6 m/s^2 (you can see from the speedometer we went from 60 MPH to 25 MPH over 3 seconds). As you can see, the driver was doing everything he could within these 3 seconds, expecting an impact.

t minus 0: impact at ~20 MPH

So our driver clearly could have reacted slightly faster, applied the brake 1-1.5 seconds earlier, and braked harder, since the maximum possible braking of a vehicle is about 8 m/s^2. But at that instant, since he had you, a reporter, in the car, he was clearly overwhelmed, and the anxiety caused the slight delay in reaction time for decision making.

In hindsight we should’ve enabled our emergency braking functionality, as this would’ve been a rare opportunity to test and demonstrate our automated braking system, which would’ve braked harder than the human driver. What happened was an unfortunate minor incident involving human driving error which, had we enabled our full system, could have been avoided.
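For what it’s worth, the arithmetic in Phantom’s account holds up. Here is a quick back-of-envelope check in Python, using only the figures from the timeline above and ignoring the lead car’s own motion:

```python
# Sanity check of the deceleration figures in Phantom AI's account.
MPH_TO_MPS = 0.44704

v0 = 60 * MPH_TO_MPS  # speed when hard braking began (~26.8 m/s)
v1 = 25 * MPH_TO_MPS  # speed three seconds later (~11.2 m/s)
t = 3.0               # braking duration per the timeline (seconds)

decel = (v0 - v1) / t
print(f"observed deceleration: {decel:.1f} m/s^2")  # ~5.2, inside the claimed 5~6 range

# The same 3 seconds at the ~8 m/s^2 physical limit Phantom cites:
v_limit = max(0.0, v0 - 8.0 * t)
print(f"speed after 3 s of max braking: {v_limit / MPH_TO_MPS:.0f} MPH")  # ~6 MPH
```

The observed deceleration of roughly 5.2 m/s^2 sits inside the claimed 5~6 m/s^2 range, and the second figure illustrates why Phantom argues that earlier or harder braking could have avoided the impact entirely.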

Nuro’s self-driving vehicle is a grocery-getter and errand-runner


Not every self-driving car has to be able to move passengers from point A to point B. Take, for example, Nuro: the startup just revealed its unique autonomous vehicle, which is more of a small mobile logistics platform than a self-driving car.

The company, which had been working away in stealth mode in Mountain View until now, has raised a $92 million Series A round led by Banyan Capital and Greylock Partners to help its unique vision of autonomous transport take shape.

Nuro’s vehicle is a small, narrow box on wheels, about half the width of a regular car, designed as a lightweight way to get goods from a local business to a customer, or from one person to another within a neighborhood or city. The platform is just one example of what Nuro wants to do, however; the startup bills itself as a product company focused on bringing “the benefits of robotics” to everyday use and ordinary people.

Nuro’s AV also operates completely autonomously, and looks like something you’d see on a Moon base in a retro-futuristic sci-fi show. There’s a PIN pad for user interaction, so that only the right customer can access the contents stored within, and a top-mounted sensor array that includes LiDAR, optical cameras and radar (other sensors are located around the vehicle to enable its autonomous driving).

The young startup’s goal is to partner with businesses to set up transportation services. You can easily imagine this slotting nicely into something like Uber Eats, bringing food from a local lunch spot to nearby offices full of people who are hungry but can’t make the trip in person. Or the vehicles could support Amazon’s last-mile needs for in-city delivery. Nuro isn’t yet talking about specific partnerships, however.

This fit-for-purpose vehicle and dedicated focus could help Nuro accomplish some of the vision that Ford has for its AV program, for instance, with potentially fewer barriers to deployment in limited markets and specifically bounded environments. It’s still early days for the startup, however, and it’s competing in some ways with more established young companies like Starship Technologies. Still, it’s a neat first product and an interesting vision.