
Musk endeavours

#178810 Postby BobbyD » November 7th, 2018, 4:09 pm

dspp wrote:BD,

Re starting with the right amount of sensors I humbly disagree. Starting with far too many will simply cause dev team distraction, rework, and increased cost and time. To use an extreme example, let's say a warship could have a radar, a sonar, and an opto-tracker, and the job is to create an AI-based short-range missile defense system. You'd start by ignoring the sonar data, as it will simply be irrelevant for 99% of the short-range situations, and integrating it would be a very challenging job. So instead you observe that you can get a viable solution with just the other two and crack on.


I think one thing we can definitely agree on is that we disagree...

Let me restate my argument. The important thing here, from my point of view, is the achievement. If you can achieve full AD, even if you do it by strapping an AWACS to your roof and towing a small server farm behind you, then you are breaking important ground. You've got your chimp. He might be a very large, very heavy chimp with significant headroom issues, but he is a chimp. Computer power is a matter of time and money, it's known tech, and once you have a working chimp you can see how he reacts to having bits shaved off. I'm not suggesting using the kitchen sink, but the progress made by companies using LIDAR would indicate to me that at the moment LIDAR confers a significant advantage in achieving monkey status. And if you are putting your experimental system out into the real world with real innocent bystanders, you should know that it is as safe as you could make it, which you can't know if you haven't tried variations which might make it safer.

dspp wrote:That tome as you call it is a real goldmine, and I spent a few hours wading through it and the various links yesterday. As best I can see the market leaders in autonomous vehicle systems are Waymo (Google/Alphabet), MobilEye (Intel), and Tesla, plus the wildcard of Apple and various Chinese. There are many interesting things about the scene, including Intel's assessment that they can sell each kit for a few thousand $ (sensors + compute + software + data services) in return for which the OEMs add $5k to the price tag; and the issues behind the MobilEye / Tesla split (which is far more complex than you suggest, imho).


I wasn't aware 'tome' was derogatory... it is interesting, but also massive, and only partially consumed.

I'd throw in APTIV (formerly Delphi), and remove Tesla from that list, and call it leaders in autonomous cars rather than vehicles.

The idea of out-of-the-box AD was proposed by Delphi a few years ago, with a predicted arrival at the factory gates of 2019, and shop-floor presence a couple of years later. At the time they were working closely with MobilEye, and Intel was later introduced as a chip supplier before going on to buy MobilEye.

The MobilEye/Tesla split comes down to who you believe... Maybe it was inevitable in the medium term, but I find MobilEye credible, and think that the reputational damage that continued association with Tesla could have done them was too big a risk.

dspp wrote:Interestingly MobilEye/Intel have now realised that they cannot ignore L2+ as a pathway and so they have 15-20 or so projects that are beginning to come to market to deliver comparable functionality (e.g. GM Cadillac Super Cruise is a MobilEye system).


I don't think MobilEye ever discounted L2 as a product market; after all, Tesla's Autopilot was MobilEye, and I think you'll find them present in most non-Tesla L2 systems with optics.

dspp wrote:What will be interesting is how easy it is to get from L2+ to L4/5


Agreed, although I think it is at best the scenic route to L4/5.

dspp wrote:Remember the three Tesla fatalities everyone keeps going on about. At least two of those were running MobilEye + Tesla on HW1. I'm not sure about the third. The first may or may not have been switched on, so it's unclear. https://en.wikipedia.org/wiki/List_of_s ... fatalities


My point regarding the fatalities was just about the reputational harm they did to AD, at a time when AD was considerably more vulnerable than it is now. That harm still lingers very strongly in the perception of people with no interest in the subject, which takes me back to why I believe in MobilEye's rationale for severing ties with Tesla.

dspp wrote:What is interesting is that everything coming to market in the 2019-2020 period running L2 / L2+ will be doing so without an integrated LIDAR as far as I can see. There is no industry consensus on whether a LIDAR is required for L4/L5.


As far as I'm aware there is a fairly strong consensus that LIDAR is essential for L4/5 as things stand.

#178824 Postby dspp » November 7th, 2018, 5:14 pm

BD,

Tome not intended to be derogatory. It is a very large goldmine and my search keeps expanding in autonomous vehicle (AV) space.

We know two-lens stereoscopic vision is enough for meat. So eight-lens stereo is overkill, especially when backed up with radar (now onto a v3 from what I read) and the ultrasonics. I've not noted any humans with built-in LIDAR. I'm not saying that LIDAR might not add value, just that the jury is out. ME for sure are not using it.
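To put numbers on the stereoscopic point: with a calibrated two-camera rig, range falls out of the disparity between the two images via the standard pinhole relation depth = focal length × baseline / disparity. A minimal sketch, with made-up focal length, baseline and disparity values for illustration:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from a calibrated stereo pair via the pinhole model.

    focal_px     -- focal length in pixels
    baseline_m   -- separation of the two camera centres in metres
    disparity_px -- horizontal shift of the feature between the images
    """
    if disparity_px <= 0:
        raise ValueError("zero disparity: feature at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 1000 px focal length, 30 cm baseline.
print(stereo_depth(1000.0, 0.3, 6.0))  # 50.0 (metres)
print(stereo_depth(1000.0, 0.3, 3.0))  # 100.0 -- halving disparity doubles range
```

Note how quickly depth resolution degrades with range: at long distances a sub-pixel matching error swings the estimate by tens of metres, which is one reason the LIDAR-vs-cameras argument exists at all.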

I think the ME/Tesla split was coming long before it happened. However I think it suited ME for Tesla to adopt them, up until Intel bought them. But basically from the get-go I think they were both using the other to learn enough to escape from the relationship. Why else would ME refuse Tesla raw vision? Why else was Tesla running their own system in parallel from well before the bifurcation fatality incident? I think the timing of the actual split came as a surprise to both parties, as it was probably when the bosses at Intel leaned in and said our reputation is worth too much. For sure it probably set Tesla back a year or more whilst they kludged their own software to run on the AP2 hardware. They now appear to be getting very good results from HW2.5 and v9, with HW3 in test (probably for an S/X update and the Y is my guess).

You may find http://techcastdaily.com/2018/10/29/int ... -10-29-18/ of interest. It shows Tesla are no longer just in copycat fast follower mode.

How many players there really are is a bit of a pick & mix. Some are component suppliers, some system suppliers, some system integrators; some all of this, some different combos. To an extent this arena reminds me of the smartphone ecosystems back in 2000 or so. We don't know whether this industry is going horizontal, or vertical, or what. Once at L4/L5 AV + EV, what added value really is coming from the OEM that is not originating from the battery/pack manufacturer, plus the charger network operators, plus the AV suppliers? You kinda wonder if you might not see many "coachbuilders" but only a few full-system internals/underpinnings providers. It is rather disruptive...

I'm not rushing to make dedicated AV bets in this area for sure. I'm very glad I can see other value in Tesla apart from AV, but still it is high risk in my portfolio and so only about 1% or so.

regards, dspp

#178826 Postby odysseus2000 » November 7th, 2018, 5:23 pm

BobbyD wrote:Let me restate my argument. The important thing here from my point of view is the achievement. If you can achieve full AD, even if you do it by strapping an AWACS to your roof and towing a small server farm behind you, then you are breaking important ground. ... If you are putting your experimental system out into the real world with real innocent bystanders you should know that it is as safe as you could make it, which you can't know if you haven't tried variations which might make it safer.


We have had this discussion before.

Your arguments are straight out of academia: give us all the money we ask for, give us all the best equipment, give us as long as it takes, and we will see if we can make this work in a form that is not practical and which is too expensive to be commercialised.

In the commercial world you only get money if you can demonstrate some return, or a very likely near-future return; you have deadlines, you have limits on equipment and personnel, etc. If you don't perform you get removed from the job. All of the robotic driver teams will be under intense pressure to make this work asap.

As Brunel put it: "Engineering is doing for sixpence what any fool can do for a shilling."

The argument that you are putting innocents at risk also does not stand up. Currently 10 people are being killed or seriously injured on UK roads EVERY day. We need a solution to reduce this as fast as possible, and that will have to involve risk, just as railways, canals and early motor transport led to deaths and injuries. It would be great if they could be avoided and everyone hopes they are, but we could all benefit from safer roads.

Regards,

#178886 Postby BobbyD » November 7th, 2018, 11:22 pm

dspp wrote:We know two-lens stereoscopic vision is enough for meat.


If you are building a meatware-based AI I can see the relevance... otherwise not so much.*

dspp wrote:I think the ME/Tesla split was coming long before it happened. However I think it suited ME for Tesla to adopt them, up until Intel bought them. But basically from the get-go I think they were both using the other to learn enough to escape from the relationship. Why else would ME refuse Tesla raw vision? Why else was Tesla running their own system in parallel from well before the bifurcation fatality incident? I think the timing of the actual split came as a surprise to both parties, as it was probably when the bosses at Intel leaned in and said our reputation is worth too much.


Intel didn't own MobilEye at that point, although we appear to have ended up agreeing that the timing of the split, at least, was designed to protect MobilEye's reputation.

dspp wrote:How many players there really are is a bit of a pick & mix. Some are component suppliers, some system suppliers, some system integrators. ... It is rather disruptive...


That was my initial assumption way back in about 2015/6... but as things progress there are turning out to be more potentially viable systems out there. Good for the future of AD; it makes picking a winner a bit more complicated though...

odysseus2000 wrote:We have had this discussion before.

Your arguments are straight out of academia: give us all the money we ask for, give us all the best equipment, give us as long as it takes, and we will see if we can make this work in a form that is not practical and which is too expensive to be commercialised.


My arguments are consistent with the approach of the people who have functioning L4/5 systems... all of whom are commercial companies looking to gain a return on their investment. If it's a matter of sensor cost, that will crash when you have full-scale L5 car production on every continent; if it's a matter of processing power, computer chips keep getting faster and cheaper. The only bit which is magic by today's standards is the Chimpiness of your code, and you can't buy Chimpiness.

* Never taken a car to pieces, have taken a few brains to pieces in the distant past though.

#178889 Postby odysseus2000 » November 7th, 2018, 11:48 pm

BobbyD wrote:My arguments are consistent with the approach of the people who have functioning L4/5 systems... all of whom are commercial companies looking to gain a return on their investment. If it's a matter of sensor cost, that will crash when you have full-scale L5 car production on every continent; if it's a matter of processing power, computer chips keep getting faster and cheaper. The only bit which is magic by today's standards is the Chimpiness of your code, and you can't buy Chimpiness.


No one has a functioning L5 system as far as I am aware.

For folk wanting to know what all the L's mean:

https://medium.com/iotforall/the-5-auto ... 2a5e834928

For level 5 there is no user input and no continuous monitoring of the car as is required by the Waymo permit, although it does have an impressive top speed of 65 mph:

https://www.autoblog.com/2018/10/31/way ... ccounter=1

It would be nice to see how this system would handle the blocked lane that is only revealed when the car in front swerves to avoid it, as in the Tesla tests.

As I have typed several times, there is no certainty as to what will be needed for a full level 5 system. It may require LIDAR and a host of other stuff, it may never be possible, or it may be possible using sensors that are similar to a human's plus suitable neural nets.

No one knows.

Making a system that is human-like in its sensor abilities may be enough, or it may never be enough. If it is, then it is likely to become the dominant system, as it will be much cheaper to make and can be deployed to existing cars.

Regards,

#178903 Postby Itsallaguess » November 8th, 2018, 4:59 am

Interesting discussion.

One aspect that's been on my mind for some time now with regard to automated cars, and which hasn't yet been discussed at all on-line as far as I can see, is insurance scams.

We all know of the scams sometimes employed with human-driven vehicles, where unscrupulous people set up situations on roads, at junctions and at roundabouts that result in whiplash claims, but I'd be interested to see how those scams develop if and when automated cars become much more prevalent.

We've already heard that there seem to be some specific scenarios (stalled traffic exposed to an automated car very late by a car in front leaving a lane) which automated cars simply can't handle, and I imagine those scenarios would quickly be exploited by insurance scammers keen to rake in money from known failure situations.

I thought I'd mention this as it's an area that I've not seen being considered during the many discussions here or elsewhere.

Insurance is going to be a very interesting aspect of car automation. We've got well-established processes for human-based accidents, and it's accepted that we're all capable of both causing and experiencing accidents where many humans are involved with moving vehicles, but it'll be interesting to see where the insurance situation goes in terms of large-scale car-automation, when it inevitably occurs...

Cheers,

Itsallaguess

#178933 Postby dspp » November 8th, 2018, 9:29 am

Itsallaguess wrote:One aspect that's been in my mind for some time now with regards to automated cars, and hasn't yet been discussed at all on-line as far as I can see, is to do with insurance scams. ...


iag,

Keep up at the back :)

This was raised on this thread a few days ago :
*****

#178251 Postby dspp » November 5th, 2018, 2:38 pm

The issue is bad outcomes with false positives.

e.g. If the car slams the anchors on, thinking that something is in the highway, and the car behind rear-ends it, and there was nothing in the highway. That's a suefest.

You get a lot of false positives in the real world.

The lawyers don't care if the autonomous system is in general better than the meat system. They sue about the particular instance.

That's before we even get to "cash for crash" fraud. Imagine gaming Teslas this way...

*****
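The false-positive trade-off in the requoted post can be framed as a textbook expected-cost calculation: brake only when the estimated probability of an obstacle makes the risk of hitting it outweigh the risk of a phantom-brake rear-ending. A toy sketch; all the costs and probabilities here are invented for illustration:

```python
def should_brake(p_obstacle, cost_collision, cost_phantom_brake):
    """Brake iff the expected cost of not braking exceeds that of braking.

    Not braking risks the collision with probability p_obstacle; braking
    on a false positive incurs the phantom-brake cost (being rear-ended,
    lawsuits) with probability (1 - p_obstacle).
    """
    expected_cost_no_brake = p_obstacle * cost_collision
    expected_cost_brake = (1.0 - p_obstacle) * cost_phantom_brake
    return expected_cost_no_brake > expected_cost_brake

# With a collision costed at 100x a phantom stop, braking is justified
# from roughly a 1% obstacle probability upwards.
print(should_brake(0.02, 100.0, 1.0))   # True
print(should_brake(0.005, 100.0, 1.0))  # False
```

The point is that with realistic perception uncertainty the system must accept either some phantom braking or some missed obstacles; no threshold eliminates both.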

You'll also find insurance aspects being discussed by vendors and OEMs and legislators quite a lot if you dig around.

regards, dspp

#178935 Postby odysseus2000 » November 8th, 2018, 9:32 am

This is an interesting video of an attempt at an insurance scam:

https://twitter.com/simonsugar/status/9 ... 87905?s=20

It's kind of amusing in hindsight, as it comes out of nowhere and must have been frightening at the time, but the way the scammers react when they realise they have been recorded suggests to me that effective insurance scams against robotic cars will be very much more difficult, given all the cameras such vehicles have.

Most public service vehicles I have ridden in already have cameras, and the prevalence of the crime where a leading vehicle slams on the brakes right in front of, say, a bus in order to be rammed has declined. My insurance company at my last renewal asked if I had a dash cam in my private car, so perhaps that scam has moved to private cars.

Regards,

#178954 Postby BobbyD » November 8th, 2018, 10:46 am

Tesla has appointed Robyn Denholm as chair of its board, replacing the electric carmaker’s billionaire founder Elon Musk.

Denholm, one of two female directors on the nine-member board, assumes her new position immediately, Tesla said. It will become full-time once she leaves her role as chief financial officer and head of strategy at Telstra, Australia’s largest telecom company, after her six-month notice period.

The 55-year-old Australian will temporarily step down as chair of Tesla’s audit committee until she quits Telstra.


- https://www.theguardian.com/technology/ ... -elon-musk

#178968 Postby BobbyD » November 8th, 2018, 11:47 am

odysseus2000 wrote:For folk wanting to know what all the L's mean:

https://medium.com/iotforall/the-5-auto ... 2a5e834928

For level 5 there is no user input and no continuous monitoring of the car as is required by the Waymo permit, although it does have an impressive top speed of 65 mph:



That's a bad summary, which isn't even consistent with the definitions given by the one of its two 'further reading suggestions' that is still available*.

The widely adopted 0-5 level segmentation comes from the Society of Automotive Engineers (SAE International), but their document is behind a paywall. The NHTSA gives the levels as:

Levels of Automation: Who Does What, When

Level 0 The human driver does all the driving.

Level 1 An advanced driver assistance system (ADAS) on the vehicle can sometimes assist the human driver with either steering or braking/accelerating, but not both simultaneously.

Level 2 An advanced driver assistance system (ADAS) on the vehicle can itself actually control both steering and braking/accelerating simultaneously under some circumstances. The human driver must continue to pay full attention (“monitor the driving environment”) at all times and perform the rest of the driving task.

Level 3 An Automated Driving System (ADS) on the vehicle can itself perform all aspects of the driving task under some circumstances. In those circumstances, the human driver must be ready to take back control at any time when the ADS requests the human driver to do so. In all other circumstances, the human driver performs the driving task.

Level 4 An Automated Driving System (ADS) on the vehicle can itself perform all driving tasks and monitor the driving environment – essentially, do all the driving – in certain circumstances. The human need not pay attention in those circumstances.

Level 5 An Automated Driving System (ADS) on the vehicle can do all the driving in all circumstances. The human occupants are just passengers and need never be involved in driving.

- emphasis added to improve legibility


- https://www.nhtsa.gov/technology-innova ... lf-driving

* (The NHTSA is the one 'further reading' recommendation from that bad summary which is still available, albeit now here: https://www.nhtsa.gov/sites/nhtsa.dot.g ... 9a_tag.pdf )

You are also confusing the capability of the car with the legal requirements under which it is licensed to operate.
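For reference, the NHTSA/SAE levels above boil down to a small lookup, and the key operational split (who must monitor the driving environment) lands between L2 and L3. A sketch of them as data:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, per the NHTSA summary above."""
    NO_AUTOMATION = 0        # human does all the driving
    DRIVER_ASSISTANCE = 1    # ADAS assists with steering OR speed, not both
    PARTIAL_AUTOMATION = 2   # ADAS controls both; human must monitor throughout
    CONDITIONAL = 3          # ADS drives in some conditions; human on standby
    HIGH_AUTOMATION = 4      # ADS drives fully within its design domain
    FULL_AUTOMATION = 5      # ADS drives everywhere, always

def human_must_monitor(level):
    """At L0-L2 the human must watch the road continuously;
    from L3 up the system carries that burden (within its domain)."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(human_must_monitor(SAELevel.PARTIAL_AUTOMATION))  # True
print(human_must_monitor(SAELevel.HIGH_AUTOMATION))     # False
```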

#179010 Postby dspp » November 8th, 2018, 3:51 pm

I appreciate it is a somewhat apples to oranges comparison, but here are the stats for how many miles each fleet has done on some version or other of autonomous as of late 2017:

Tesla: ~1 billion
Waymo: 4 million
Uber: 2 million
Cruise: ~0.5 million

Those numbers are from https://medium.com/self-driving-cars/mi ... 7bda21b0f7 but also match what I have crunched out of my own spreadsheet, which suggests Tesla are probably at 2.8 bln Autopilot miles now. They also match what you can get from https://www.youtube.com/watch?v=LSX3qdy0dFg and http://fortune.com/2018/05/31/whos-winn ... -car-race/. For Tesla I found one dataset from back in 2017 that gave a 24% engagement rate of Autopilot across the whole Tesla fleet (not all of which has Autopilot enabled for the customer, of course), which corresponds fairly well to the anecdotal evidence out there.
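Those ratios are worth making explicit; a quick sketch using the round figures quoted above:

```python
# Approximate cumulative miles on some form of autonomy/Autopilot as of
# late 2017, from the figures quoted above (Tesla's being L2 miles).
fleet_miles = {
    "Tesla": 1_000_000_000,
    "Waymo": 4_000_000,
    "Uber": 2_000_000,
    "Cruise": 500_000,
}

ratio = fleet_miles["Tesla"] / fleet_miles["Waymo"]
print(f"Tesla fleet mileage is roughly {ratio:.0f}x Waymo's")  # 250x
```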

I appreciate that Tesla right now are at L2+ whilst the others are claiming L4/L5. I don't think the comparison is entirely fair to Tesla, as the Tesla miles are utterly wild, whereas the others are mostly geofenced and/or in well-trained environments and/or in highly constrained environments (e.g. Cruise on only freeway use). Similarly we don't know how much of the early (say) Waymo miles were really running at L2, etc.

Another relevant point I have picked up is that most of the OEMs are on three-year model cycles whereas Tesla is roughly on a one-year cycle. Combine both and you get an interesting race.

By the way here is an El Reg teasing of MobilEye https://www.theregister.co.uk/2018/05/1 ... rs/?page=2 .

#179018 Postby Itsallaguess » November 8th, 2018, 4:18 pm

dspp wrote:
Keep up at the back

This was raised on this thread a few days ago :

*****
#178251 Postby dspp » November 5th, 2018, 2:38 pm

That's before we even get to "cash for crash" fraud. Imagine gaming Teslas this way ........

*****


Ah - I'll admit to missing that - so all I can say is that you've raised a very good point!

Cheers,

Itsallaguess

#179176 Postby BobbyD » November 9th, 2018, 12:52 pm

dspp wrote:I appreciate that Tesla right now are at L2+ whilst the others are claiming L4 / L5. I don't think the comparison is entirely fair to Tesla as the Tesla miles are utterly wild, whereas the others are mostly geofenced and/or in well-trained environments and/or in highly constrained environments (e.g. Cruise on only freeway use). Similarly we don't know how much of the early (say) Waymo miles were really running at L2 etc.


Tesla have a lot of miles, but none of them are autonomous, and they include miles for which we know they don't have the raw data.

L4 is geofenced, or at least geofencing is the best implementation of L4, since if the design parameters are set that way the car can operate totally autonomously within that area. Once you can remove that geofencing you have L5! But then Tesla is effectively geofenced too, unless it can handle all road classes.
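The geofencing point is essentially an operational-design-domain check: before engaging, the system tests whether its current position falls inside an approved polygon. A minimal sketch using the standard ray-casting test (the geofence coordinates here are invented):

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: count how many polygon edges a rightward
    horizontal ray from (x, y) crosses; an odd count means inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # this edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside

# Toy geofence: a unit square standing in for an approved operating area.
geofence = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(point_in_polygon(0.5, 0.5, geofence))  # True  -> L4 engagement allowed
print(point_in_polygon(2.0, 2.0, geofence))  # False -> outside the ODD
```

A real deployment would of course use surveyed map regions rather than a hand-drawn polygon, but the engage/refuse decision has this shape.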

#179187 Postby BobbyD » November 9th, 2018, 1:32 pm

VW plans to sell electric Tesla rival for less than 20,000 euros - source

FRANKFURT (Reuters) - Volkswagen (VOWG_p.DE) intends to sell electric cars for less than 20,000 euros (17,400 pounds) and protect German jobs by converting three factories to make Tesla (TSLA.O) rivals, a source familiar with the plans said.

...

Another vehicle, the I.D. Aero, will be built in a plant currently making the VW Passat, a mid-sized sedan, the source said.

...

An electric van, the ID Buzz, is due to be built at VW’s plant in Hannover, where its T6 Van is made, the source said.

To free up production capacity for electric cars in Hannover, VW’s transporter vans could be produced at a Ford (F.N) plant in Turkey, if German labour unions, who hold half the seats on VW’s board of directors, agree, the source added.


- https://uk.reuters.com/article/us-brita ... KKCN1NE104

#179191 Postby dspp » November 9th, 2018, 1:49 pm

BD,

I think the link is https://uk.reuters.com/article/uk-volks ... KKCN1ND2QP

The timeline I have previously seen associated with these VW plans is that the hatchback would become available in 2019, and other models would not follow before 2020 at best. I'm sure they can all make EVs, it is the rate at which they can get production up and running in volume that is in doubt for everybody, Tesla included. However right now we know Tesla/Panasonic are making at least 50% of the cells/packs.

Let's see on price/performance when the real product is available.

regards, dspp

#179193 Postby odysseus2000 » November 9th, 2018, 1:59 pm

This is the world's first (?) AI news anchor.

If you didn't know, would you believe it was human?

https://twitter.com/XHNews/status/1060161714123984901

Regards,

#179204 Postby BobbyD » November 9th, 2018, 2:48 pm



Nope, your link now goes to the same article mine has moved to...

The headline is: VW plans to sell electric Tesla rival for less than $23,000: source

and the current link is: https://www.reuters.com/article/us-volk ... SKBN1ND2C2

OK, that's weird... now your link is working for me again... either way...

#179429 Postby onthemove » November 10th, 2018, 10:45 pm

odysseus2000 wrote:As I have typed several times, there is no certainty as to what will be needed for a full level 5 system. It may require LIDAR and a host of other stuff, it may never be possible, or it may be possible using sensors that are similar to a human's plus suitable neural nets.


Apologies for jumping in... I've been skimming through the thread, but quite distracted, so I have probably missed much.

I've quoted the above, but to be fair, my response could be applicable to a number of comments by different posters...

There are a few considerations that I hope the industry - and regulators - will be working out...

The Minimum
I've seen a video from one research group where a (toy) car is driven autonomously (for obstacle avoidance) using nothing more than a monocular camera. The deep learning network behind it (and, I believe, the sole purpose of that specific research) was to demonstrate what could be done with a single camera trained to infer depth from 'familiarity', i.e. the semantic information in each single image: for example, by inferring scale and distance from the expected size of things in the scene, like people, trees, buildings and so on.

I've just googled to try and find the above one again - unsuccessfully so far - but have found this, which makes specific use of monocular video for estimating the drivable area in front of the vehicle (just one aspect of what would need to be inferred from the images):

http://www.cs.toronto.edu/~yaojian/freeSpace.pdf
"In this paper we propose a novel algorithm for estimating the drivable collision-free space for autonomous navigation of on-road and on-water vehicles. In contrast to previous approaches that use stereo cameras or LIDAR, we show a method to solve this problem using a single camera."

All in all, I would say there is already enough research out there to be confident that you could, in theory, achieve a self-driving vehicle using only one single camera.
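The 'scale from expected size' trick described above is just the pinhole camera model run backwards: if the network recognises an object class and knows its typical real-world height, the object's height in pixels gives an estimate of its range. A toy sketch; the typical heights and focal length are illustrative assumptions, not values from the papers mentioned:

```python
# Assumed typical real-world heights in metres -- illustrative priors only.
TYPICAL_HEIGHT_M = {"person": 1.7, "car": 1.5, "lorry": 4.0}

def range_from_size(obj_class, pixel_height, focal_px):
    """Monocular range estimate: distance = focal * real_height / pixel_height."""
    return focal_px * TYPICAL_HEIGHT_M[obj_class] / pixel_height

# A detected person 85 px tall, seen with a 1000 px focal length, is ~20 m away.
print(round(range_from_size("person", 85.0, 1000.0), 2))  # 20.0
```

The obvious weakness is the prior: a child, or a person seated, breaks the assumed height, which is exactly why this works as one cue among several rather than alone.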

But you would never find me riding in such a vehicle. For many reasons, but also importantly (and I don't think this has been mentioned, though I could easily have missed it)...

Fail Safety
Stuff fails. Sensors get obstructed.
(Human) drivers tend to know their vehicle. They can feel when something isn't right. They can adapt - drive more slowly, allow more time when they observe a 'risk' ahead if they know their brakes are feeling soft, e.g. when children are playing at the side of the road.

With the driver out of the control loop - no longer feeling how the car responds to their tap on the accelerator, etc. - that in itself is going to require additional work on the 'intelligence' side. The AD system will need to monitor how well its demands get translated into actions - acceleration, steering or braking - and adapt accordingly (since there is always going to be some variance, not just over time in a single vehicle, but also from unit to unit off the production line). But the AD will also need to determine when expected 'adaption' becomes a cause for concern requiring a trip to the garage.
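That 'monitor how well demands get translated into actions' idea is essentially a residual check: compare the commanded response against the measured one and flag sustained deviation. A minimal sketch, with an invented window size and threshold:

```python
from collections import deque

class ActuationMonitor:
    """Flag degradation when commanded vs. measured response diverge.

    Keeps a sliding window of residuals; once the window is full, a
    persistently large mean residual suggests the actuator (brakes,
    throttle, steering) needs attention.
    """
    def __init__(self, window=50, threshold=0.5):
        self.residuals = deque(maxlen=window)
        self.threshold = threshold

    def update(self, commanded, measured):
        """Record one control cycle; return True if maintenance is needed."""
        self.residuals.append(abs(commanded - measured))
        mean = sum(self.residuals) / len(self.residuals)
        return len(self.residuals) == self.residuals.maxlen and mean > self.threshold

# Brakes consistently delivering 1 m/s^2 less deceleration than commanded:
monitor = ActuationMonitor(window=5, threshold=0.5)
for _ in range(5):
    needs_garage = monitor.update(commanded=3.0, measured=2.0)
print(needs_garage)  # True
```

The sliding window is what distinguishes a one-off (wet road, heavy load) from the sustained drift that should trigger a garage visit.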

But equally importantly, the system will need to be able to identify whether the sensors are operational - and when they need maintenance. They will need to distinguish between fog, spray and mist, compared to dirt on the lens that needs human intervention. They will need to identify smearing wipers on the camera's 'windscreen'. And so on.

Now, achieving this could be a sliding scale. At the simplest: have 2 redundant sensors, and if they don't agree, stop the car and tell the occupants to call out the breakdown truck.

But I suspect (hope!) that socially (i.e. through regulation) that wouldn't be acceptable. There'd just be too many vehicles pulled up on the side of the road. And if one of two sensors fails - how do you know which? Could you even safely pull over to the side of the road if all you know is that the sensors don't agree? You don't actually know which one you can trust in order to manoeuvre to a safe position.

I truly hope the regulators demand quite a high level of redundancy, purely to minimise the 'need to stop' - far more sensors than the theoretical minimum needed to actually achieve AD.
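The 'which of two disagreeing sensors do you trust?' question is exactly why redundancy usually means three, not two - with a third reading you can out-vote a single failure. A toy sketch of majority voting (function name, tolerance and readings all invented for illustration):

```python
# Hypothetical 2-of-3 voting: a sensor that disagrees with both of its
# peers is treated as the faulty one and excluded from the consensus.

def vote(readings, tolerance=0.5):
    """Return (consensus_value, suspect_indices) for redundant readings."""
    suspects = []
    for i, r in enumerate(readings):
        others = [x for j, x in enumerate(readings) if j != i]
        if all(abs(r - o) > tolerance for o in others):
            suspects.append(i)
    trusted = [r for i, r in enumerate(readings) if i not in suspects]
    if not trusted:
        return None, suspects        # no consensus at all: stop safely
    return sum(trusted) / len(trusted), suspects

value, bad = vote([10.1, 10.2, 47.0])   # sensor 2 has clearly failed
```

With two sensors the same disagreement is undecidable; with three you still get a trustworthy value to manoeuvre to a safe position with.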

Expectations

AD only needs to match humans to theoretically justify itself. But the public would never buy that. The Daily Mail and other scaremongering newspapers would report every accident as though AD were something to be afraid of. They wouldn't care about whether it matches humans or not.

And I think most people - me included - would really expect some degree of safety improvement from AD compared to humans. After all, that is part of the AD promise. Alongside the ability to put up your feet and read a book, browse the web, watch a movie, etc, on your commute home in your own personal car, we are also being promised that AD cars will always be on the lookout, never get tired, never get distracted.

So there is definitely a perception and (quite genuine) expectation that AD will surpass human drivers in terms of safety. So again, it's not down to what a human can do - we want AD to do better. It's not down to what sensors would match a human - it is a question of what sensors give a realistic safety improvement.

To throw around some numbers plucked completely out of thin air to illustrate what I mean... if you could match a human with (say) 2 sensors and a couple of mirrors, but - at reasonable cost - could halve the accident rate per mile by using 20 sensors... then I hope the regulators across the globe would look towards making the latter the minimum requirement for AD.

Though only where reasonably practical and cost effective. Even if AD only equalled humans, and that was the best it could do economically, then OK, it's still justified. But if a small additional investment in sensors per vehicle could slash accident rates - then that surely has to be considered the minimum regulatory requirement.

Modularity

One other, slightly tangential, consideration: what about modularity?

I'd be really curious to know how the different manufacturers are already architecting their systems for modularity.

I mean, a straightforward ANN (artificial neural net) is trained with a fixed number of inputs and a fixed number of outputs. I suppose the convolutional aspect (i.e. that the same 'weights' are repeatedly applied across all nodes in a layer) does perhaps allow a degree of flexibility, though that is likely limited to the first layer.
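Roughly what I mean, as a hypothetical sketch (shapes and names invented): a fully connected layer bakes its input size in at training time, whereas a convolution slides the same weights over an input of any length.

```python
import numpy as np

# Illustrative only: a dense layer fixes the input size, while a
# convolution kernel adapts its output length to the input length.

rng = np.random.default_rng(0)

dense_w = rng.standard_normal((128, 10))   # dense layer: exactly 128 inputs

def dense(x):
    return x @ dense_w                     # only works when len(x) == 128

kernel = rng.standard_normal(3)            # 1-D conv kernel, 3 taps

def conv1d(x):
    # 'valid' convolution: output length follows the input length
    return np.convolve(x, kernel, mode="valid")

print(dense(np.zeros(128)).shape)   # (10,)
print(conv1d(np.zeros(50)).shape)   # (48,)
print(conv1d(np.zeros(200)).shape)  # (198,)
```

So the convolutional layers tolerate a change of input resolution, but anything downstream that flattens into fixed-size dense layers still locks the architecture to the sensors it was trained with.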

But then how do you deal with multiple sensors?
I think in practice it's almost certain that any ANNs related to vision are going to be specific to a single camera input - if you have two cameras, each will have its own ANNs, which then perhaps feed into more conventional algorithms that build up a 3d model of the surroundings, upon which the driving decisions are then taken (e.g. the sort of thing Waymo show when illustrating their AD systems, and I get the impression that Tesla provide a similar representation of what the car 'sees' around it).

But it does raise the question of what standards will emerge. I mean, simplistically, you can't just plug a lidar into an ANN that was trained for vision/camera.

To be flexible with sensors - to do the sort of experimentation suggested by some - needs designing in up front! I mean, if you just gave someone the remit 'hey, use 20 sensors, 10 lidar - get it working', and didn't make it clear that you wanted the result to be flexible to experimentation by reducing the number of sensors, then - just to get something going - they might simply feed all the sensors into a single big ANN and let it learn to handle all of them. The problem is, if you then took one of the sensors out, you'd basically have to start the entire training again!

It's a challenge!

It's going to be very interesting to see how the industry adapts to modularisation with AD.

I can see that the industry will need it. I can't imagine any company will last long without it. I mean, different groups are good at different things. Ultimately modularity will win the day.

But how?

AD needs to be real time. I could imagine multiple 'modules' taking the input image/video from each camera.

  • You might have ANNs for object recognition (pedestrians, cyclists, other vehicles, etc) in order to reason about the 3d space - work out who's going where, what might collide etc.
  • Another ANN, connected to the same camera but developed by a different team, that concentrates on the 'drivable' area - where's the road edge, where are the potholes, etc.
  • You might have another (still connected to the same camera) ANN for looking for road markings, and other highway code signs related to directions, bus lanes, emergency vehicle lanes, no entry signs, etc.
  • You might have another ANN related to identifying traffic lights, pedestrian crossings and other scenarios which relate to the actionable rules of driving - where you might have to yield way in certain circumstances.
  • You might have another related to reading sign posts - diversions, speed limit signs, road works, etc.
  • You might have another looking out for policemen/policewomen directing traffic (something Google were already demonstrating their systems could do before the project was split off into Waymo)

But then how do you standardize this information so that you could interchange modules? Certainly a very interesting technical challenge.
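Just to illustrate one conceivable shape such a standard could take (everything here - record fields, module names, values - is invented for illustration): every perception module, whatever its sensor, emits the same typed record, so a downstream fusion stage - or a replacement module from another vendor - can consume it without caring who produced it.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical standardised output record shared by all perception modules.

@dataclass
class Detection:
    label: str                       # e.g. 'pedestrian', 'traffic_light_red'
    confidence: float                # 0..1
    box: Tuple[float, float, float, float]  # (x1, y1, x2, y2), shared frame
    source: str                      # module/sensor that produced it
    depth_m: Optional[float] = None  # filled in if the sensor measures range

def merge(*module_outputs):
    """Combine outputs from independent modules into one snapshot,
    highest-confidence first."""
    return sorted((d for out in module_outputs for d in out),
                  key=lambda d: -d.confidence)

cam = [Detection("pedestrian", 0.94, (100, 50, 140, 150), "front_cam")]
lidar = [Detection("obstacle", 0.99, (95, 45, 145, 155), "roof_lidar", 7.2)]
world = merge(cam, lidar)
```

The point of the shared record is that swapping one vendor's camera module for another's becomes a matter of matching the interface, not retraining the whole stack.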

A lidar won't be able to recognise a pedestrian as a pedestrian. So the output of a lidar sensor and its associated ANNs isn't going to carry the same semantic information as the ANNs attached to a camera - you couldn't simply swap a lidar module (incl. ANNs) for a camera module (incl. ANNs).

In fact, in reality the lidar is likely to complement the camera (plus ANN) outputs - a central AD processing unit would likely fuse the two in a complementary way, overlaying the lidar information onto the elements identified by the camera/ANNs in order to add reliable depth information to the camera's semantic output.
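A minimal sketch of that overlay step, assuming the lidar returns have already been projected into the camera's image coordinates (function name, boxes and numbers all invented): attach a depth estimate to each camera/ANN detection, leaving detections with no lidar coverage as they are.

```python
# Hypothetical late fusion: give each camera detection the depth of the
# nearest lidar return falling inside its bounding box.

def fuse(detections, lidar_points):
    """detections: dicts with 'label' and 'box' = (x1, y1, x2, y2);
    lidar_points: (x, y, depth_in_metres) tuples in image coordinates."""
    fused = []
    for det in detections:
        x1, y1, x2, y2 = det["box"]
        depths = [d for (x, y, d) in lidar_points
                  if x1 <= x <= x2 and y1 <= y <= y2]
        # Nearest return is the conservative choice for collision avoidance.
        fused.append({**det, "depth": min(depths) if depths else None})
    return fused

dets = [{"label": "pedestrian", "box": (100, 50, 140, 150)},
        {"label": "sign", "box": (400, 20, 430, 60)}]
pts = [(120, 100, 7.2), (130, 90, 7.5), (300, 40, 22.0)]
result = fuse(dets, pts)   # pedestrian gets a depth; the sign gets None
```

The `None` case is exactly the 'areas without lidar data' situation - the central processing unit has to cope with it either way, which is the argument some make for dropping lidar entirely.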

When you look at it that way, I really can't imagine why anyone would dream of not fusing the two together. OK, lidar is only valid in certain circumstances - it doesn't cover everywhere - so yes, there would always be some areas of the images (outputs from the camera/ANN) without lidar data, and the central AD processing unit would have to know how to deal with those. You could argue that if it can deal with those, it can deal without any lidar at all... in theory, yes...

..but really... lidar (and radar, ultrasound, etc) all work well at short to medium distances - i.e. where you are most likely to hit pedestrians, cyclists, animals, etc. It just seems insane not to include this extra dimension of information to make a far more robust model from which to act.

Anyway, it really does feel like we are in the midst of a revolution.

In another 5 years, when AD becomes more widespread and mainstream, it'll be interesting to watch what the new buzzwords become. What will the technical aficionados talk about? Today the geeks talk about cylinders, exhausts, catalytic converters, fuel injection...

.. just imagine in 5 years' time - I could easily see the language change completely, the talk being of what modules your car has fitted, what neural modules you have, what processing units process this information, who manufactured them, etc.

To bring it on topic, I think this could trip Tesla up somewhat. They currently claim their cars have all the hardware they need for AD. I think in a few years, Tesla - and the cars' owners - will look back and think that was naïve.

Even if Tesla manage to retrofit some software-update AD to their current hardware, I think owners will see where the industry has moved to, and most probably won't take the software upgrade. I suspect most will rather go with the better-modularized, better-standardized vehicles that are currently still at the design/concept stage.

Companies like Waymo and nVidia, who aren't actually producing their own cars, are entirely reliant upon their offering being something that can be fitted modularly to other companies' vehicles.

And quite rightly, different companies will demand different sensors and sensor configurations to differentiate themselves from the competition.

Consumers will demand differentiation too - I mean, those paying large sums - the BMW/Merc/Audi drivers of today - will demand far better, far more powerful sensors than the bottom-end models get.

And the AD processing units from Waymo and nVidia will have to adapt to this varied demand.

Moreover, they will ultimately need to build their systems to allow other sensor manufacturers/developers to fuse their sensors into the nVidia and Waymo systems.

If a company like Tesla has already built cars not designed for this flexibility, then I suspect they'll be left behind.

I suspect that commercial reality will quickly push Tesla (and the rest) towards more open standards. But that won't be good for cars built before those interchangeable standards were finalised. If I were buying a Tesla now, I wouldn't get too excited about the promise of full automation on the existing hardware via future software updates.

Ultimately - once the dust starts to settle and the regulations start to catch up - I think the complexity of the AD modules (at the module level, with all the different ANNs serving different functions, feeding into additional processing modules, etc) could match or surpass the complexity of the mechanical components within an engine.

And with the need for these modules to be hard real time (https://whatis.techtarget.com/definitio ... ime-system ) / safety critical, I just cannot see how you could hope to achieve this from a purely software upgrade.

My current bet would be that the ANNs get manufactured into dedicated neural-network circuitry that is self-contained within the module (there are already prototypes showing very fast, very low-power operation of neural networks on a chip dedicated to that purpose - for AD it isn't going to be something computed on a regular general-purpose CPU).

It's tempting to think you wouldn't buy a camera; you'd instead buy a camera + ANN as a single package, with a standardised output that you plug into your AD controller.

But even that is probably not enough, given the various functions you could perform on the image from a camera (see the list above).

And modularity doesn't necessarily mean consumer-level modularity. It may be that the modularity is more like different integrated circuits that the manufacturer can choose to include in the circuitry for a specific model. But that doesn't change the need for some degree of standardization of the inputs and outputs of the various modules.

Anyhow, sorry for rambling... if only this technology had come around 15 to 20 years ago, shortly after I left uni... I'd have loved the technical challenge of working properly on a serious AD system (not a toy hobby system). I was creating a neural net (before deep learning came around) to navigate a robot around a maze back when I was at uni... I would have loved to have progressed onto working on proper, real-world AD systems.


For what it's worth (probably not a lot, though it may give an idea how I see the industry) ... I'd more than happily go to work for Waymo / nVidia or any of the established car manufacturers to work on AD systems.

I'd be more hesitant at working for Tesla / Uber.

Why? Gut feeling.
Put simply - Waymo and nVidia seem to be serious about doing proper 'offline' (away from consumers) development before unleashing anything on the real-world consumer. And that is good. Similarly, the established car manufacturers already have experience of developing 'enhanced' systems (like antilock brakes, etc) that are of a safety-critical nature, and they understand the need for properly engineered development.

Tesla/Uber - purely in my own personal opinion - seem to be a little too cavalier at pushing out their attempts too soon. It has come as no surprise to me that they are the first to have resulted in fatalities. I don't think my style of working / my approach to development and design would suit working at either of these companies on AD.

And for the same reason, I also suspect that Tesla and Uber are inadvertently setting themselves up to get wrong-footed by regulations.

If the other manufacturers can demonstrate substantially better safety and better reliability with their lidar (or other additional sensor fusion), etc, then I suspect - hope even - that the regulators will look to make those kinds of systems the model for regulations.

And that could really throw a spanner in the works for other efforts that have tried to cut corners, do things on the cheap, skimp on sensor inputs, etc.

Sorry .... I'll stop now... :)

Re: Musk endeavours

#179437

Postby BobbyD » November 11th, 2018, 3:33 am

Yowzers!

onthemove wrote:
All in all, I would say there is already enough research out there to be confident that you could in theory achieve a self driving vehicle using only 1 single camera.


That's a very forward-looking setup... I bet lane changing would be exciting!

Can't for a number of reasons see it ever being licensed...

onthemove wrote:AD only needs to match humans to theoretically justify itself. But the public would never buy that. The Daily Mail and other scaremongering newspapers would report every accident as though AD were something to be afraid of. They wouldn't care about whether it matches humans or not.


Neither would governments. Just being as good as its predecessor in safety terms hasn't been acceptable for cars in a long time, and AD will offer a prime opportunity to push this further, largely at other people's expense...

onthemove wrote:I mean, a straightforward ANN (artificial neural net) is trained with a fixed number of inputs and a fixed number of outputs. I suppose the convolutional aspect (i.e. that the same 'weights' are repeatedly applied across all nodes in a layer) does perhaps allow a degree of flexibility, though that is likely limited to the first layer.


Isn't sensor redundancy going to demand some degree of flexibility? We talk about having an AD in the singular, but would that really be the case?

onthemove wrote:To be flexible with sensors - to do the sort of experimentation suggested by some - needs designing in up front! I mean, if you just gave someone the remit 'hey, use 20 sensors, 10 lidar - get it working', and didn't make it clear that you wanted the result to be flexible to experimentation by reducing the number of sensors, then - just to get something going - they might simply feed all the sensors into a single big ANN and let it learn to handle all of them. The problem is, if you then took one of the sensors out, you'd basically have to start the entire training again!


I think of it more along the lines of: if you've got it working with 20 sensors and nobody has died, then you might have a crack at a next generation with 18 sensors... The problem is no longer can you do it, but how efficiently can you do it. If you start off with two plastic cups and a length of string, you'll never learn whether you are attempting something currently beyond your capabilities, or trying to do something very hard with an inadequate toolbox.

onthemove wrote:I can see that the industry will need it. I can't imagine any company will last long without it. I mean, different groups are good at different things. Ultimately modularity will win the day.


I think companies recognise this. There is a surprising amount of cooperation going on, although not universally.

onthemove wrote:It's tempting to think you wouldn't buy a camera; you'd instead buy a camera + ANN as a single package, with a standardised output that you plug into your AD controller.


Isn't this just an extension of MobilEye's annoying Tesla by only outputting processed data?

Re: Musk endeavours

#179438

Postby BobbyD » November 11th, 2018, 3:42 am

Ah, the thing I actually meant to post...

I'm not stumping up $3800 for Navigant's Automated driving Leaderboard https://www.navigantresearch.com/report ... g-vehicles

But the results graph for a recent copy is shown in the last figure in this article: https://eu.usatoday.com/story/money/car ... 800037002/

19 companies ranked, have a think about your running order before taking a peek.

This is how it looked a year ago: http://uk.businessinsider.com/the-compa ... rst-2017-4

