Monday, May 21, 2018

A Manager's Guide to Augmented Reality

Probably a good thing to be the company providing the best displays for this technology.

Harvard Business Review (more at the source)

Augmented reality, a set of technologies that superimposes digital data and images on the physical world, promises to close this gap and release untapped and uniquely human capabilities. Though still in its infancy, AR is poised to enter the mainstream; according to one estimate, spending on AR technology will hit $60 billion in 2020. AR will affect companies in every industry and many other types of organizations, from universities to social enterprises. In the coming months and years, it will transform how we learn, make decisions, and interact with the physical world. It will also change how enterprises serve customers, train employees, design and create products, and manage their value chains, and, ultimately, how they compete.

More broadly, AR enables a new information-delivery paradigm, which we believe will have a profound impact on how data is structured, managed, and delivered on the internet. Though the web transformed how information is collected, transmitted, and accessed, its model for data storage and delivery—pages on flat screens—has major limits: It requires people to mentally translate 2-D information for use in a 3-D world. That isn’t always easy, as anyone who has used a manual to fix an office copier knows. By superimposing digital information directly on real objects or environments, AR allows people to process the physical and digital simultaneously, eliminating the need to mentally bridge the two. That improves our ability to rapidly and accurately absorb information, make decisions, and execute required tasks quickly and efficiently.

Saturday, May 19, 2018

Great moments in Investing

I just saw this picture, and thought it was awesome.... but the current market cap of Apple puts 10% at $915.74B/10, or about $91.6 billion...
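A quick sanity check on that arithmetic, using the ~$915.74B market cap quoted above (the figure moves daily, so treat it as a snapshot):

```python
# Sanity check on the "10% of Apple" figure quoted above.
# Assumes the ~$915.74B market cap from the post; the number changes daily.
market_cap_b = 915.74          # Apple market cap, in billions of USD
ten_percent_b = market_cap_b / 10

print(f"10% of Apple ≈ ${ten_percent_b:.2f} billion")
```

So "91 Billion" in the caption rounds down slightly; the exact figure at that market cap is closer to $91.6 billion.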

Friday, May 18, 2018

Message to Shareholders from the CEO

If they were inclined to dilute for the hell of it, we would have had this request long ago.... 

I'm for this. It gives them options to deal with threats, embrace opportunities, enter into strategic partnerships, or simply reward shareholders. (I think a forward split during a good run, with good publicity would be a gift to shareholders.) 

May 18, 2018

Dear Fellow MicroVision Stockholder,

With a little more than six months as CEO of your Company behind me, I am convinced that we are now moving in a direction that will pave the way to achieving profitability at some point during 2019. As we discussed on our May 9 conference call with the investment community, we are on track with our plans, including:

  • The recently announced license agreement for display-only products with a leading global technology company that includes the payment of $10 million in license fees this year and provides MicroVision with additional payments for nonrecurring engineering expenses as well as other expected benefits.
  • The achievement of key milestones for the $24 million contract we signed in April 2017 with a Tier 1 technology company.
  • We made samples of our new MEMS scanner and our new time-of-flight ASIC available for customers’ evaluation.
  • Our next-generation analog ASIC has been taped out and we made significant progress on our next-generation digital ASIC.

We believe the markets we are targeting have exceptional growth potential. We also believe our technology and the underlying ability to deliver have never been better, and with a very customer-focused, value-driven concentration of effort, we are positioned to increase shareholder value, but we need your help.

Your Vote is Important to MicroVision’s Success

It is important that you vote your shares at the upcoming 2018 Annual Meeting of Stockholders in favor of each of the proposals, particularly Proposal 3 relating to an increase in our authorized shares. Please note that the approval to increase the number of authorized shares does not mean the shares will be issued, only that they are available if needed. The shares will provide us with the capital flexibility we need as we continue to execute on our strategic plan. Also, it is important to note that Proposal 3 requires approval by the holders of a majority of our outstanding common stock; not voting is the same as a vote against the proposal. Your vote is extremely important regardless of the number of shares you own.

While our Board of Directors (the “Board”) unanimously recommended that you vote FOR each of the seven Board nominees and FOR each of the proposals, I would like to take this opportunity to highlight Proposal 3 and discuss why the Board has determined that the proposal is in your best interest.

Proposal 3 asks you to approve an amendment to our charter to increase the authorized number of shares of the Company’s common stock by 50,000,000 to a total of 150,000,000 shares. This proposal benefits our stockholders in the following ways:

  • Provides Shares for Future Financings: Without an increase in the number of authorized shares, we expect that the Company will be limited in its ability to raise needed capital in the future. Currently we have approximately 14 million shares of common stock available for issuance. The Company expects to need additional capital to fund its business. If we do not have shares of common stock available to issue, we would be severely restricted in our ability to raise additional funds needed for our continued operations. While the Board is sensitive to the dilution that results from issuing additional shares of common stock and will continue to weigh that concern, it believes the stockholders will be best served by the Company having shares of common stock available for financings or other transactions that it determines are worth pursuing.

  • Provides Flexibility: The Board would like to increase the number of authorized shares of common stock to provide the Company with flexibility to issue shares of common stock for general corporate purposes, which could include, among other uses, financings, as discussed above, as well as for strategic partnering arrangements, equity incentive plans, acquisitions of assets or businesses, stock splits or stock dividends. The availability of additional authorized shares of common stock would allow the Company to accomplish these goals, and other business and financial objectives, in the future without the delay needed for authorizing shares for those purposes at the time. Increasing the number of authorized shares at this time does not mean we have current plans to issue these shares, only that they are available if needed by the company.

  • A Leading Independent Proxy Advisory Firm Recommends Approval of Proposal 3. Glass Lewis & Co., a leading institutional proxy advisory firm, recommends voting for Proposal 3 and concludes that it is prudent for the Company to have additional common shares available for issuance and that the request to increase the authorized common shares is “reasonable.”

Thursday, May 17, 2018

Job Posting....

I love seeing great confidence.

From Silkroad....

Thank you Joe S!

Senior LiDAR Systems Engineer

Job Description

MicroVision has developed a highly disruptive automotive LiDAR sensor and computing platform based on the ultra-miniature 3D laser scanning technology pioneered by the company. We are using the latest advancements in machine learning, computer vision and computing technologies to give our LiDAR sensors amazing perception capabilities for autonomous vehicles. This is not your grandfather’s low-resolution spinning-bucket-of-chicken-on-the-roof-of-the-car LiDAR sensor. No, this is a LiDAR sensor that is half the volume of your cellphone and costs about as much. It produces beautiful, dense point clouds that look like high resolution video and, importantly, can be mass produced in quantities in the millions.

We continue to build out our world class engineering team and have an immediate opportunity for a badass Senior LiDAR Systems Engineer at our Redmond, WA facility. Do you like finding elegant solutions to really hard problems? Do you find that engineering comes naturally to you and you’ve always dug deeper to understand the underlying physics of a problem? As a kid, did you spend hours trying to find a generalizable closed-form solution to the three-body problem because you didn’t believe your high-school physics teacher? Are you the type of engineer that craves an environment where you’re close to blue lakes, green forests, and it occasionally rains? If this describes you, then we want to hear from you.

Our team is made up of the very best minds from industry. We are experts in autonomous robotics, computational computer vision, fault-tolerant spacecraft avionics, automotive and consumer electronics. We’ve played significant roles at Google, SpaceX, Lytro, Velodyne LiDAR and within NASA/JPL’s avionics and robotics communities. As a hardcore engineering team, we don’t believe in half measures and we’re looking for more engineers just like us.

Required Skills

We are looking for a Senior or Principal Systems Engineer that is comfortable working across the optical, electro-optical, MEMS, mechanical & electrical engineering disciplines to help model and plan our next generation LiDAR photonics module. This is a full-time salaried position at our Redmond, WA facility.

What you’ll be doing:
Work within a small, agile engineering team to help define, model, prototype, design, deliver and mass produce a world-class LiDAR sensor. 

Maintain and manage product system, sub-system and component requirements and specifications through the entire product life cycle.

Perform hands-on modeling, design, testing and evaluation of new LiDAR and imaging hardware & software.

Required Experience

An analytical mindset to probe, test and decompose challenging system problems into root-cause issues, with strong, detail-oriented skills to solve them.

Intuitive understanding of optoelectronic and opto-mechanical design, packaging, assembly, integration and test principles.

Excellent teamwork skills are essential as we have a strict no jerk policy.

(A question to ask yourself as you look at this: Velodyne is making LiDAR for cars that is large, and they charge $75,000 for it (LATimes). They have other units, with the cheapest going for $4,000. What happens when it can be produced in mass quantities, in much smaller units, for the price of your smartphone?)
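A rough back-of-the-envelope on that cost spread, using the $75,000 and $4,000 figures above plus a hypothetical ~$750 "smartphone-class" price (my own illustrative assumption, not a company figure):

```python
# Rough cost-spread comparison using the prices quoted above.
# The $750 "smartphone-class" price is an illustrative assumption only.
velodyne_top = 75_000      # large automotive unit (the LA Times figure)
velodyne_cheapest = 4_000  # Velodyne's cheapest unit, per the post
smartphone_class = 750     # hypothetical mass-produced price point

print(f"Top unit vs smartphone-class: {velodyne_top / smartphone_class:.0f}x")
print(f"Cheapest unit vs smartphone-class: {velodyne_cheapest / smartphone_class:.1f}x")
```

Even against Velodyne's cheapest unit, a smartphone-priced sensor would be a roughly 5x cost reduction; against the automotive flagship it is about 100x.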

Wednesday, May 16, 2018

My bet on "Black Box"

This is just an educated guess. I readily admit that I could be very wrong about this. This is just dot connecting from publicly available information.

This is what I suspect and why.

In the fall we were told about the "Black Box" project that once the development was complete, we would be able to sell the results to anyone.

If someone is paying for development, it's exceptionally odd that they would allow anyone else to use what they paid to develop -- unless of course they would benefit if anyone else used it.

So who would benefit from anyone's use of it? Companies that do things on a variety of devices -- 

There are various common computer operating systems, stationary and mobile: Windows, Linux, iOS and Android (Google).

Which services are likely to be used on all operating systems, and which are "Tier 1"?


So, that's my short list: Google, Amazon, and Facebook.

It doesn't matter what operating system you're using... people will still use all three of these. You would connect to Google if you used an Amazon device or to Amazon if you used a Google device. Same for Facebook. No matter who produces the device, you would still use the other service -- and in my mind, that's the only way you'd pay to develop some hardware, and then allow (and encourage) your development partner to sell it to anyone and everyone they want to.

If you add other system software to it, you add Microsoft. Apple is also doing some interesting things with AR and sharing it; they're opening up.

So Microsoft and Apple are also contenders, but less, in my mind than the other three.

So, go back to look at the 4th Quarter Conference call. (delivered during Q1)  

What are all of those companies working on? (all five). 

Smart homes -- The description delivered during the conference call sounds like that. Review the Question and Answer portion of that CC:

MicroVision Q1 2018
Perry Mulligan / Q1 Conference Call

It is. So the capabilities I'm describing in this perceptive element exist within the context of the 1-meter, 1.5-meter interactive display, the 10-meter display and the 30-meter display and perhaps the -- sorry, not display, LiDAR. Perhaps the best way to think of these, Henry, is to think of them as having sort of different levels of vocabulary. The things that the 1-meter display -- or the 1-meter LiDAR -- will have to recognize will be a relatively small number of things: gestures, point, touch, compression, squeezing the picture, flipping the page. With the 10-meter LiDAR, you can see how the number of things that would have to be recognized will increase, for the device to be able to send a message that says it's your child walking towards an open door versus your dog running through a dog port -- that would be an example of those differences. Maybe it's you walking down the hall past the bookcase, so don't turn the lights on for the bookshelf, or your wife walking towards the bookcase to get a book, so turn the lights on there and illuminate it. So you can see that the language, or the vocabulary perhaps, of the device would increase, and then within the automotive space it would increase again.

....think of our display engine embedded in your voice-only device. So that as you shave in the morning, not only do you listen to the news, you see it displayed on the washroom wall, and that becomes a little bit more meaningful experience. As you walk down the hall towards the kitchen, our sensing device knows it's you that's walking down the hall. It adjusts the coffee and turns the lights on appropriately. And then an interactive display that's invisible, but when you call it up, it comes out as an Alexa-type device or something of that nature that allows you to interact with it because of the sensing capabilities, gesture recognition and then disappears when it's not required. So we really see this as sort of a suite of solutions that helps AI platforms with their user interface. 
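One way to picture the "vocabulary" idea from the call: each sensor class reports a small set of high-level event messages instead of raw point clouds, and the vocabulary grows with range. A minimal sketch of that tiering (the sensor names and event labels below are my own illustrative choices, not anything MicroVision has published):

```python
# Sketch of the tiered "vocabulary" idea from the conference call: each
# sensor class emits compact, high-level events rather than raw point-cloud
# data. All names below are illustrative assumptions, not MicroVision's API.
SENSOR_VOCAB = {
    "lidar_1m":  {"point", "touch", "squeeze", "flip_page"},          # gestures
    "lidar_10m": {"person_approaching", "child_near_door",            # smart home
                  "dog_moving", "person_passing", "person_reaching"},
    "lidar_30m": {"vehicle", "pedestrian", "cyclist",                 # automotive
                  "static_obstacle"},
}

def emit_event(sensor: str, event: str) -> str:
    """Return a compact message if the event is in the sensor's vocabulary."""
    if event not in SENSOR_VOCAB.get(sensor, set()):
        raise ValueError(f"{event!r} not in {sensor} vocabulary")
    return f"{sensor}:{event}"

print(emit_event("lidar_10m", "child_near_door"))  # → lidar_10m:child_near_door
```

The point of the design is bandwidth: a home hub or AI platform consumes a few bytes per event instead of millions of points per second.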

So, look at what is happening with these companies:

Google  Integrating Smart Home Devices --- Clearly designed so that people can integrate Google Smart Home into OTHER manufacturers' devices. This is a thorough explanation of how to do it.


Amazon Connect your devices to Alexa  At CES I had a very interesting conversation with one of the people doing Amazon Alexa certification. She reported that anyone can make a device that uses Alexa, but they must submit it to be an "approved" Alexa-enabled device and sell it on Amazon. 

Facebook Portal  Remember they had a privacy snafu, and delayed the announcement. 

Visit an Amazon Smart Home (It's on my near-term agenda)

Cortana Smart Home

Apple Home Kit

Amazon Smart Home Devices (clearly willing to work with anyone)

Hints about "machine learning"

 there are solutions out there today that do 3D scanning, perhaps as an example, for facial recognition. They require high compute energy and use approximately 30,000 points to do that calculation. Our range of solutions will provide between 5 million and 20 million points per second of resolution in the 10-meter space. So the density of the information we have at the sensor allows us to make simple messaging analytics or messaging content that enables users to do so much more with the device than simply trying to plug them with this plethora of data. It is almost diametrically opposed to the way most entities are solving sensing applications today. Almost everybody is trying diligently to get more information from the sensor, pass it down the pipe to a centralized processor that allows it to do a calculation and figure out what's going on. We have so much information at the sensor. We have the luxury of sending messaging, which just makes it much easier for the entire system to be responsive. And it will be a shame not to capture that.
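Taking the numbers in that quote at face value, the density gap is easy to quantify (a sketch; it assumes, as stated, that the 30,000-point figure is per facial-recognition calculation and the 5-20 million figure is points per second):

```python
# Density comparison using the figures quoted in the conference call above.
face_scan_points = 30_000                      # points per face calculation (quoted)
mvis_low, mvis_high = 5_000_000, 20_000_000    # points per second (quoted range)

# How many 30k-point "scans" worth of data arrive each second,
# at each end of the quoted range:
print(f"Low end:  {mvis_low // face_scan_points} scan-equivalents/sec")
print(f"High end: {mvis_high // face_scan_points} scan-equivalents/sec")
```

That is roughly 166 to 666 face-scan-equivalents of data per second, which is why the call argues for classifying at the sensor and sending messages rather than shipping the raw point cloud to a central processor.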

Friday, May 11, 2018

Another Visit from Randy

I've been busy, and contemplating the CC, past CC's, the license agreement, and other recent news.

The recent license agreement seems to indicate that enough cash will be available for everything. There is a key phrase in the press release about the purchase of MEMS and ASICs. I would like to know what the margins on those components will be. Either way, it looks like a minimum of $25-35 million in revenue to the company in the next year.

This does NOT include any exclusive rights to other parts of their technology portfolio (not LiDAR / 3D sensing, not interactive display, not near-eye display), which means this is in addition to other things we think are probably going to be bigger money makers.

"The agreement does not include a license to MicroVision technology related to interactive displays, LiDAR and 3D sensing, or near-eye applications, such as augmented or virtual reality, which remain key vertical markets for MicroVision."

There have been some assumptions about display only PicoP -- that it wasn't going to earn much for the company, the company didn't care, etc. I was never a believer in that theory. That display technology answers two huge problems for the most popular consumer electronics product. (said so the other day here) 

Consumers universally want their smartphones to be physically smaller with a larger display.

Product Adoption -- can happen at a stunning rate. 

Cellphones went from nothing to huge very quickly, and cameras in smartphones even faster. One of the things about cameras, is that people like to share the pictures they take -- MORE than they like to take the pictures.

Now, the idea that the company would be forced to raise money is mostly off the table. (This licensing agreement isn't going to be the only revenue source.)

Shorts -- (about 12% of the float) stand a pretty good chance of being in trouble.

One mention of MicroVision on financial media in the same breath as any specific "Tier 1" technology company, and things will change completely.

When that happens, new potential investors will start learning about it the way it is today. (They won't care how much pain investors in the company from the past 20 years have felt, not even a little bit.) 

Things are looking really good. 

When  we get mentioned with one of these companies, and some of the investors take just a little of their money to invest in MVIS... things could get VERY interesting, very quickly.

Thursday, May 10, 2018

Alexa, Apple, Google.... Black box?

Have stuff to say, but the kid needs a ride.

CC Impressions


Amazon wants to show you how much you can rely on its Alexa voice assistant to control everything in your home, from thermostats to TVs.
The company announced Wednesday it's partnering with homebuilder Lennar to create "Amazon Experience Centers" — model homes containing built-in Alexa-controlled appliances — in hopes of persuading homeowners and homebuyers to embed Amazon services into their houses.
Lennar is outfitting model homes with TVs connected to Amazon's Fire TV system, buttons that can reorder goods from Amazon, and Alexa-enabled products that allow customers to control the lights, TV, shades and other appliances with their voices. Customers can tour the homes in cities around the U.S. to get an idea of how they work.
Apple has taken a similar approach. It has already teamed with homebuilders to create model new homes with HomeKit-ready products that can be controlled from an iPhone or iPad with Apple's Siri assistant. Brookfield Residential is one such homebuilder.
For Amazon, it's another example of how badly it wants to be in your home. Doing so lets it collect valuable data on how you purchase goods.
Apple, Amazon and Google are all racing to get smart assistants into your homes. Most households probably only have room for one of those — so getting customers into the ecosystem when they're buying a home might keep them for the long haul.
(I have scraped this so as not to lose it.)

Wednesday, May 9, 2018

MicroVision, Inc. Announces New License Agreement with a Leading Technology Company

So, a Leading global technology company: The most likely candidates I would consider are those that already have a working relationship with MicroVision:

Foxconn / Sharp 
ST Microelectronics (I think this is the most likely.)


REDMOND, Wash., May 09, 2018 (GLOBE NEWSWIRE) -- MicroVision, Inc. (MVIS), a leader in innovative ultra-miniature projection display and sensing technology, today announced that it has entered into a license agreement with a leading global technology company to allow the licensee to use MicroVision’s display technology to manufacture and sell display-only engines based on MicroVision reference designs.  The agreement grants a worldwide, exclusive, five-year license to display-only technology.  In order to maintain exclusivity, the licensee is required to purchase minimum quantities of MEMS and ASICs from MicroVision.  
“We are excited to announce this agreement with a recognized world leader in the manufacturing and sale of electronic products and components,” said Perry Mulligan, MicroVision’s Chief Executive Officer.  “This license agreement for display-only products validates our product roadmap and Go-To-Market strategy of targeting five key vertical markets. We believe our licensee's manufacturing prowess and use of our Laser Beam Scanning technology will assist MicroVision in lowering product costs and securing new customers in the other verticals that we serve.”
The licensee has agreed to pay MicroVision a license fee of $10 million in 2018.  An initial payment of $5 million is scheduled to be paid this quarter and a second payment of $5 million is scheduled to be paid in October.  As part of the agreement, MicroVision expects to receive additional payments for non-recurring engineering expenses and services associated with process and product transfer and qualification milestones.   
The agreement does not include a license to MicroVision technology related to interactive displays, LiDAR and 3D sensing, or near-eye applications, such as augmented or virtual reality, which remain key vertical markets for MicroVision.

Tuesday, May 8, 2018

Verizon's First 5G installation will be in Sacramento

Getting much closer to reality for all of us.

From 5G

The fixed 5G equipment -- using Verizon's own "5GTF" specification -- will provide a wireless 1-Gbit/s connection over the air for homes to surf the web, stream video and make live virtual reality (VR) connections, according to Samsung.

Verizon Communications Inc. (NYSE: VZ) says it will launch three to five fixed 5G markets by the end of this year. Sacramento is the only market that Big Red has announced so far. (See Verizon Says 'Up to 5' Fixed 5G Markets Will Go Live in 2H18 and Is Atlanta the Next 5G Market for Verizon?)

The Federal Communications Commission (FCC) has now approved Samsung's indoor 5G router. Previously, it approved Samsung's access unit and outdoor 5G router. Samsung says this now means that its "end-to-end system" is ready for Verizon to use.

This 28GHz system will likely be the first commercial millimeter wave (mmWave) 5G system ready for use in the US. "The indoor 5G home router is a small consumer device that receives and transmits the 5G signal to provide ultra-high speed broadband wireless service," Samsung says in a statement; the router shares this 5G signal with multiple devices via WiFi. (See Samsung Supplying 5G for Verizon's First Market and Verizon's Fixed 5G: Are You Ready for the Wireless Gig Rush?)

Uber's Autonomous Car Detected a Pedestrian, chose not to stop

This is where the benefit of a steerable and dense point cloud comes in handy. Or a LiDAR that can tell the difference between a cat and a brick.

Technology Review

The Information


"Pedestrian Fatalities Skyrocket"

The news: According to a report by the Information, the vehicle’s software did in fact detect the pedestrian, but it chose not to immediately react. The car was programmed to ignore potential false positives, or things in the road that wouldn’t interfere with the vehicle (like a plastic bag). But those adjustments were taken too far.

Why? The car may have been part of a test for increased rider comfort. Autonomous cars aren’t known for their smooth rides, and by ignoring things that are probably not a threat, a vehicle can cut down on the number of start-and-stop jerks riders experience.

What’s next? Uber is conducting a joint investigation with the National Transportation Safety Board, after which more details are expected to be released. In the meantime, the report could inspire other self-driving-vehicle companies to treat potential false positives with more caution.

(Scraped to retain.)
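The reported behavior amounts to a relevance threshold set too aggressively: detections scored as likely false positives are filtered out before the planner reacts. A minimal sketch of that failure mode (class names, scores, and thresholds are my own illustration, not Uber's actual software):

```python
# Illustration of how an over-aggressive false-positive filter can drop a
# real pedestrian. Classes, scores, and thresholds are invented for the sketch.
BRAKE_THRESHOLD = 0.9  # only react to detections scored above this confidence

def should_brake(detections):
    """Return True if any detection clears the obstacle-confidence threshold."""
    return any(score >= BRAKE_THRESHOLD for _, score in detections)

# A plastic bag is correctly ignored...
print(should_brake([("plastic_bag", 0.2)]))   # False
# ...but a pedestrian scored at 0.85 is ignored too -- the failure described.
print(should_brake([("pedestrian", 0.85)]))   # False
```

Tuning the threshold trades ride smoothness (fewer phantom stops) against safety margin (fewer missed real obstacles), which is exactly the tension the report describes; a denser, steerable point cloud gives the classifier more evidence per object, so the threshold doesn't have to carry the whole burden.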