Wednesday, December 12, 2018

TESLA & AR: Claiming some turf

I'm not a legal guy, but the text of the patent reads like a legal shotgun blast, hoping to hit some use of AR that someone else hasn't already claimed.

Whether that's the case or not, it appears they're planning to use AR to improve their manufacturing processes.


In the text of the patent application (it's an application, not a grant), they use the phrase "in some embodiments" a lot, which is what makes me think they're just trying to claim some turf in AR -- just in case there are any specifics the other players missed.

It could also mean they intend to come up with their own AR hardware and software, which would be an interesting addition to the space.


MVIS will still have the best near-eye display.

Google's wearable tech (Glass) has already seen this kind of factory use:

The image shows multiple black Tesla Model S vehicles in view, with a screenshot from APX's "Skylight" software in the top right-hand corner. This is a familiar view for those who have used Glass before -- the device is actually capable of capturing similar shots itself, called Vignettes. In the screenshot, the software shows what is purportedly a vehicle's VIN, and various options for a factory worker to act on.

Electrek

ZDNet

“In some embodiments, the AR glasses may be in the form of safety glasses. The AR device captures a live view of an object of interest, for example, a view of one or more automotive parts. The AR device determines the location of the device as well as the location and type of the object of interest. For example, the AR device identifies that the object of interest is a right hand front shock tower of a vehicle.

The AR device then overlays data corresponding to features of the object of interest, such as mechanical joints, interfaces with other parts, thickness of e-coating, etc. on top of the view of the object of interest. Examples of the joint features include spot welds, self-pierced rivets, laser welds, structural adhesive, and sealers, among others. As the user moves around the object, the view of the object from the perspective of the AR device and the overlaid data of the detected features adjust accordingly.

The user can also interact with the AR device. For example, a user can display information on each of the identified features. In some embodiments, for example, the AR device displays the tolerances associated with each detected feature, such as the location of a spot weld or hole. As another example, the overlaid data on the view of the object includes details for assembly, such as the order to perform laser welds, the type of weld to perform, the tolerance associated with each feature, whether a feature is assembled correctly, etc.

In various embodiments, the AR device detects features of a physical object and displays digital information interactively to the user. The data associated with the object of interest is presented to help the user more efficiently perform a manufacturing task.”
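To make the workflow in that excerpt concrete, here's a minimal sketch of the kind of data the overlay step would draw on: the device identifies a part, looks up its joint features (spot welds, rivets, etc.) with their tolerances, and renders labels over the live view. All names, positions, and tolerances here are hypothetical illustrations, not anything from the actual patent.

```python
from dataclasses import dataclass

@dataclass
class JointFeature:
    """One feature the AR device could label on a part (hypothetical record)."""
    kind: str             # e.g. "spot_weld", "self_pierced_rivet", "laser_weld"
    position_mm: tuple    # nominal (x, y, z) location on the part
    tolerance_mm: float   # allowed deviation for this feature

# Hypothetical part database, keyed by the identified object of interest.
PART_FEATURES = {
    "right_front_shock_tower": [
        JointFeature("spot_weld", (120.0, 45.0, 10.0), 1.5),
        JointFeature("self_pierced_rivet", (80.0, 30.0, 10.0), 0.8),
    ],
}

def overlay_entries(object_id):
    """Return the labels an AR device might draw over the live view."""
    return [
        f"{f.kind} @ {f.position_mm} \u00b1{f.tolerance_mm} mm"
        for f in PART_FEATURES.get(object_id, [])
    ]

print(overlay_entries("right_front_shock_tower"))
```

In a real system the lookup would come from CAD/manufacturing data and the positions would be re-projected into the camera view as the worker moves; this sketch only shows the feature-to-label mapping the patent text describes.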
