Microsoft open-sources a crucial algorithm behind its Bing Search services

Microsoft today announced that it has open-sourced a key piece of what makes its Bing search services able to quickly return results to its users. By making this technology open, the company hopes developers will be able to build similar experiences in other domains where users search through vast troves of data, including retail. In this age of abundant data, chances are developers will find plenty of other enterprise and consumer use cases, too.

The piece of software the company open-sourced today is a library Microsoft developed to make better use of all the data it collected and the AI models it built for Bing.

“Only a few years ago, web search was simple. Users typed a few words and waded through pages of results,” the company notes in today’s announcement. “Today, those same users may instead snap a picture on a phone and drop it into a search box or use an intelligent assistant to ask a question without physically touching a device at all. They may also type a question and expect an actual reply, not a list of pages with likely answers.”

With the Space Partition Tree and Graph (SPTAG) algorithm that is at the core of the open-sourced Python library, Microsoft is able to search through billions of pieces of information in milliseconds.

Vector search itself isn’t a new idea, of course. What Microsoft has done, though, is apply this concept to working with deep learning models. First, the team takes a pre-trained model and uses it to encode data into vectors, where every vector represents a word or pixel. Using the new SPTAG library, it then generates a vector index. As queries come in, the deep learning model translates the text or image into a vector and the library finds the most related vectors in that index.
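
To make that encode-index-search flow concrete, here is a minimal sketch in Python. It is purely conceptual: the brute-force cosine-similarity scan below stands in for SPTAG’s space-partition trees and neighborhood graph, and the vectors are randomly generated rather than produced by a real model.

```python
# Conceptual sketch of the encode -> index -> search pipeline described
# above. A real deployment would replace the brute-force scan with SPTAG's
# tree-plus-graph index; the vectors here are random stand-ins.
import numpy as np

def normalize(v):
    # Normalize so a dot product equals cosine similarity.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Pretend a deep learning model encoded 100,000 indexed items into vectors.
index_vectors = normalize(np.random.rand(100_000, 64).astype(np.float32))

def search(query_vector, k=5):
    """Return indices of the k items most similar to the query vector."""
    scores = index_vectors @ normalize(query_vector)  # cosine similarities
    return np.argsort(scores)[::-1][:k]               # top-k matches

# At query time, the same model would encode the text or image into a
# vector; the index then returns the closest matches.
print(search(np.random.rand(64).astype(np.float32)))
```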

“With Bing search, the vectorizing effort has extended to over 150 billion pieces of data indexed by the search engine to bring improvement over traditional keyword matching,” Microsoft says. “These include single words, characters, web page snippets, full queries and other media. Once a user searches, Bing can scan the indexed vectors and deliver the best match.”

The library is now available under the MIT license and provides all of the tools to build and search these distributed vector indexes. You can find more details about how to get started with the library, as well as application samples, in the project’s GitHub repository.

AWS and Microsoft reap most of the benefits of expanding cloud market

While it appears that overall economic activity could be slowing down, one area that continues to soar is the cloud business. Just this week, Amazon and Microsoft reported their cloud numbers as part of their overall earnings reports.

While Microsoft’s cloud growth rate was flat from the previous quarter, the business still grew a healthy 76 percent to $9.4 billion, a $37.6 billion run rate. Meanwhile, AWS, Amazon’s cloud division, grew 46 percent to $7.4 billion, a $29.6 billion run rate, up from $5.11 billion a year ago. As always, it’s important to remember that this isn’t necessarily an apples-to-apples comparison, as each company counts cloud revenue a little differently, but it gives you a sense of where the market is going.
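
For reference, the run rates above are just quarterly revenue annualized, and the growth figures are year over year. A quick sketch of the arithmetic, using the numbers quoted in this piece:

```python
# Run rate = quarterly revenue x 4; growth compares to the year-ago quarter.
# Figures are the ones quoted above, in billions of dollars.
msft_q = 9.4                  # Microsoft cloud revenue this quarter
aws_q, aws_prior = 7.4, 5.11  # AWS this quarter vs. a year ago

print(f"Microsoft run rate: ${msft_q * 4:.1f}B")  # $37.6B
print(f"AWS run rate: ${aws_q * 4:.1f}B")         # $29.6B
# The rounded quarterly figures give ~45%; Amazon reported 46 percent.
print(f"AWS year-over-year growth: {aws_q / aws_prior - 1:.0%}")
```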

Both businesses also face the law of large numbers in terms of growth; that is, the bigger you get, the harder it is to keep growing at a substantial rate. The two companies are doing quite well, though, considering how mature their offerings are.

Last year, Synergy Research reported that the overall worldwide cloud market grew 32 percent to $250 billion. In Synergy’s last report on cloud market share in October, Amazon was well in the lead with around 35 percent and Microsoft around 15 percent. A Canalys report from the same period had AWS at 32 percent and Microsoft at 17 percent, numbers close enough that the two reports essentially agree.

Alibaba just reported cloud revenue up 84 percent, though it holds only a small share of the worldwide market. IBM, which agreed last year to buy Red Hat for $34 billion in hopes of grabbing a bigger piece of the hybrid cloud market, reported in last week’s earnings that its cloud revenue was up only 12 percent for 2018, which seems pretty paltry compared to the rest of the market. It’s worth noting that the Red Hat deal won’t close until later this year. Google will report at the beginning of next week, but it has not been breaking out cloud revenue recently. It will be interesting to see if that changes.

Most experts agree that we are just beginning to scratch the surface of cloud adoption and that the vast majority of workloads are still locked in private data centers around the world. That means even if there is a broader economic downturn in the future, the cloud could be somewhat insulated because companies are already in the process of moving parts of their businesses to the cloud.

As these companies grow, they require ever more data centers to handle all this new business, and a Canalys report found that Microsoft and Amazon have been busy in this regard. Amazon currently has 60 cloud locations worldwide, with another 12 under construction. Canalys reports that the company’s capital expenditures (which include non-data center spend) reached $26 billion, up a modest 7 percent. Meanwhile, Microsoft, which is chasing AWS, spent much more aggressively on infrastructure, with expenditures up 64 percent to $14 billion.

You can expect that, unless something drastic happens, the pie will continue to expand, but the relative shares probably won’t change dramatically: the two leaders have hardened their positions, and it will become increasingly difficult for competitors to catch them.

Microsoft confirms Bing is down in China

Microsoft’s Bing is down in China, according to users who took to social media beginning Wednesday afternoon to complain and express concerns.

The Redmond-based behemoth has confirmed that its search engine is currently inaccessible in China and is “engaged to determine next steps,” a company spokesperson said in a statement to TechCrunch Thursday morning.

Citing sources, the Financial Times reported (paywalled) on Thursday that China Unicom, a major state-owned telecommunication company, confirmed the government had ordered a block on Bing.

Public reaction

The situation appears to be a case of DNS (domain name system) corruption, one method China uses to block websites through its intricate censorship system known as the Great Firewall. When a user enters a banned domain name, the Firewall corrupts the DNS lookup so the page never loads.

Several users told TechCrunch they were still able to access Bing by visiting its IP address directly as of Thursday morning.
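
That pattern, where the domain name fails but the raw IP address still answers, is easy to test. Below is a minimal sketch in Python; the IP address is a documentation-range placeholder, not Bing’s real address.

```python
# Minimal check consistent with the DNS-corruption theory: try resolving
# the domain, then try connecting to a known IP directly. If only DNS is
# being tampered with, the second step can still succeed.
import socket

DOMAIN = "cn.bing.com"
KNOWN_IP = "203.0.113.10"  # placeholder address, for illustration only

try:
    print(f"{DOMAIN} resolved to {socket.gethostbyname(DOMAIN)}")
except socket.gaierror as err:
    print(f"DNS lookup failed: {err}")
# Note: a poisoned resolver may also return a bogus address instead of
# failing, so comparing the result against a known-good IP is the real tell.

try:
    with socket.create_connection((KNOWN_IP, 443), timeout=5):
        print("Direct connection to the IP succeeded")
except OSError as err:
    print(f"Direct connection failed: {err}")
```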

Other users on social media believe the outage is the result of Bing’s servers crashing after a viral article (link in Chinese) attacking Baidu’s search quality directed traffic to its lesser-known American rival. Many pointed to a report from Jiemian, a news site under the state-owned Shanghai United Media Group, which said high traffic from Baidu had crashed Bing; that report now returns a 404 error.

Microsoft has long tried to play by China’s rules by filtering out sensitive results from its search engine. It also modified Windows 10 for China back in 2017 through a collaboration with state-owned China Electronics Technology Group to eliminate Beijing’s fears of possible backdoors in the American software. Former Microsoft executive Steven Sinofsky lamented Bing’s blockage in China, writing on Twitter that Microsoft had “worked so hard to be successful there.”

Tight seal

Bing had remained one of the few non-Chinese internet firms with its core product still up and running in a country where Google and Facebook have long been unavailable. Another rare case is LinkedIn, which runs a filtered version of its professional social network in China and has caught flak for bending to local censorship.

Bing also censors its search results for Chinese users, so it would be odd if its inaccessibility proves to be a case of government clampdown. That said, China appears to be further tightening its control over cyberspace. Case in point: LinkedIn recently started running strict identity checks on its China-based users.

Baidu remains the biggest search engine in China, with smaller rival Sogou coming in second. Bing, which some users find a more pleasant alternative to local options that are usually flooded with ads, is active on 320,000 unique devices monthly, according to third-party research firm iResearch. That’s dwarfed by Baidu’s 466 million and Sogou’s 43 million.

Google told the U.S. Congress in December it had no immediate plans to relaunch its search engine in China but felt “reaching out and giving users more information has a very positive impact.” The Mountain View-based firm shut down its search engine in mainland China back in 2010 under pressure over censorship but also cited cyber attacks as a factor in its decision to leave.

How open source software took over the world

It was just five years ago that there was an ample dose of skepticism from investors about the viability of open source as a business model. The common thesis was that Red Hat was a snowflake and that no other open source company would become significant in the software universe.

Fast forward to today and we’ve witnessed growing excitement in the space: Red Hat is being acquired by IBM for $34 billion (roughly three times its 2014 market cap); MuleSoft was acquired, after going public, for $6.5 billion; MongoDB is now worth north of $4 billion; Elastic’s IPO now values the company at $6 billion; and, through the merger of Cloudera and Hortonworks, a new company with a market cap north of $4 billion will emerge. In addition, there’s a growing cohort of impressive OSS companies working their way through the growth stages of their evolution: Confluent, HashiCorp, Databricks, Kong, Cockroach Labs and many others. Given the relative multiples that Wall Street and private investors are assigning to these open source companies, it seems pretty clear that something special is happening.

So, why did this movement that once represented the bleeding edge of software become the hot place to be? There are a number of fundamental changes that have advanced open source businesses and their prospects in the market.

From Open Source to Open Core to SaaS

The original open source projects were not really businesses; they were revolutions against the unfair profits that closed-source software companies were reaping. Microsoft, Oracle, SAP and others were extracting monopoly-like “rents” for software that the top developers of the time didn’t believe was world class. So, beginning with the most broadly used components of software, operating systems and databases, progressive developers collaborated, often asynchronously, to author great pieces of software. Everyone could not only see the software in the open, but, through a loosely knit governance model, add to it, improve it and enhance it.

The software was originally created by and for developers, which meant that at first it wasn’t the most user-friendly. But it was performant, robust and flexible. These merits gradually percolated across the software world and, over a decade, Linux became the second most popular OS for servers (next to Windows); MySQL mirrored that feat by eating away at Oracle’s dominance.

The first entrepreneurial ventures attempted to capitalize on this adoption by offering “enterprise-grade” support subscriptions for these software distributions. Red Hat emerged as the winner in the Linux race, and MySQL (the company) did the same for databases. These businesses had some obvious limitations: it was harder to monetize software with support services alone, but the market for operating systems and databases was so large that, in spite of the more challenged business model, sizeable companies could be built.

The successful adoption of Linux and MySQL laid the foundation for the second generation of open source companies; the poster children of this generation were Cloudera and Hortonworks. These open source projects and businesses were fundamentally different from the first generation on two dimensions. First, the software was principally developed within an existing company and not by a broad, unaffiliated community (in the case of Hadoop, the software took shape within Yahoo!). Second, these businesses were based on a model in which only parts of the software in the project were licensed for free, so they could charge customers for use of the rest under a commercial license. The commercial parts were specifically built for enterprise production use and thus easier to monetize. These companies therefore had the ability to capture more revenue, even if the market for their product didn’t have quite as much appeal as operating systems and databases.

However, there were downsides to this second-generation model of open source business. The first was that no company singularly held “moral authority” over the software, so contenders competed for profits by offering ever more of their software for free. Second, these companies often balkanized the evolution of the software in an attempt to differentiate themselves. To make matters more difficult, these businesses were not built with a cloud service in mind, so cloud providers were able to take the open source software and create SaaS businesses on top of the same code base. Amazon’s EMR is a great example of this.

The latest evolution came when entrepreneurial developers grasped the business model challenges of the first two generations of open source companies and evolved their projects with two important elements. The first is that the open source software is now developed largely within the confines of businesses; often, more than 90% of the lines of code in these projects are written by employees of the company that commercialized the software. Second, these businesses offer their own software as a cloud service from very early on. In a sense, these are Open Core / cloud service hybrid businesses with multiple pathways to monetize their product. By offering the products as SaaS, these businesses can interweave open source software with commercial software, so customers no longer have to worry about which license they should be taking. Companies like Elastic, Mongo and Confluent, with services like Elastic Cloud, Confluent Cloud and MongoDB Atlas, are examples of this Gen 3. The implication of this evolution is that open source software companies now have the opportunity to become the dominant business model for software infrastructure.

The Role of the Community

While the products of these Gen 3 companies are definitely more tightly controlled by the host companies, the open source community still plays a pivotal role in the creation and development of the open source projects. For one, the community still discovers the most innovative and relevant projects. Developers star the projects on GitHub, download the software in order to try it, and evangelize what they perceive to be the better project so that others can benefit from great software. Much like how a good blog post or a tweet spreads virally, great open source software leverages network effects. It is the community that is the source of promotion for that virality.

The community also ends up effectively being the “product manager” for these projects. It asks for enhancements and improvements; it points out the shortcomings of the software. The feature requests are not in a product requirements document, but on GitHub, in comment threads and on Hacker News. And if an open source project diligently responds to the community, it will shape itself to the features and capabilities that developers want.

The community also acts as the QA department for open source software. It will identify bugs and shortcomings in the software; test 0.x versions diligently; and give the companies feedback on what is working and what is not. The community will also reward great software with positive feedback, which encourages broader use.

What has changed though, is that the community is not as involved as it used to be in the actual coding of the software projects. While that is a drawback relative to Gen 1 and Gen 2 companies, it is also one of the inevitable realities of the evolving business model.

Linus Torvalds was the designer of the open-source operating system Linux.

Rise of the Developer

It is also important to realize the increasing importance of the developer to these open source projects. The traditional go-to-market model of closed source software targeted IT as the purchasing center. While IT still plays a role, the real customers of open source are the developers, who often discover the software, then download and integrate it into prototype versions of the projects they are working on. Once “infected” by open source software, these projects work their way through the development cycle of an organization: from design, to prototyping, to development, to integration and testing, to staging, and finally to production. By the time the open source software gets to production it is rarely, if ever, displaced. Fundamentally, the software is never “sold”; it is adopted by developers who appreciate it because they can see it and use it themselves, rather than having it imposed on them by executive decision.

In other words, open source software spreads through the hands of the true experts, and makes the selection process much more grassroots than it has ever been. Developers basically vote with their feet. This is in stark contrast to how software has traditionally been sold.

Virtues of the Open Source Business Model

The resulting business model of an open source company looks quite different than a traditional software business. First of all, the revenue line is different. Side by side, a closed source software company will generally be able to charge more per unit than an open source company; even today, customers have some resistance to paying a high price per unit for software that is theoretically “free.” But even though open source software has a lower cost per unit, it makes up for it in total market size by leveraging the market’s elasticity: when something is cheaper, more people buy it. That’s why open source companies see such massive and rapid adoption when they achieve product-market fit.

Another great advantage of open source companies is their far more efficient and viral go-to-market motion. The first and most obvious benefit is that a user is already a “customer” before she ever pays for it. Because so much of the initial adoption of open source software comes from developers organically downloading and using it, the companies themselves can often bypass both the marketing pitch and the proof-of-concept stage of the sales cycle. The sales pitch is more along the lines of, “you already use 500 instances of our software in your environment; wouldn’t you like to upgrade to the enterprise edition and get these additional features?” This translates to much shorter sales cycles, the need for far fewer sales engineers per account executive, and much quicker payback on the cost of selling. In fact, in an ideal situation, open source companies can operate with favorable account executive to systems engineer ratios and can go from sales qualified lead (SQL) to closed sale within one quarter.

This virality allows open source businesses to be far more efficient than traditional software businesses in terms of cash consumption. Some of the best open source companies have been able to grow at triple-digit rates well into their life while maintaining moderate cash burn. This is hard to imagine in a traditional software company. Needless to say, less cash consumption means less dilution for the founders.

Open Source to Freemium

One last aspect of the changing open source business worth elaborating on is the gradual movement from true open source to community-assisted freemium. As mentioned above, the early open source projects leveraged the community as key contributors to the software base, and even slight elements of commercially licensed software drew significant pushback from the community. These days, the community and the customer base are much more knowledgeable about the open source business model, and there is an appreciation that open source companies deserve a “paywall” so they can continue to build and innovate.

In fact, from a customer perspective, the two value propositions of open source software are that you can a) read the code and b) treat it as freemium. The notion of freemium is that you can use the software for free until it’s deployed in production or at some degree of scale. Companies like Elastic and Cockroach Labs have gone as far as open sourcing all of their software while applying a commercial license to parts of the code base. The rationale is that real enterprise customers will pay whether the software is open or closed, and they are more incentivized to use commercial software they can actually read. Indeed, there is a risk that someone could read the code, modify it slightly and fork the distribution. But in developed economies, where much of the rents exist anyway, it’s unlikely that enterprise companies will elect the copycat as a supplier.

A key enabler of this movement has been the more modern software licenses that companies have either embraced from the start or migrated to over time. Mongo’s new license, as well as those of Elastic and Cockroach, are good examples. Unlike the Apache license, which was often the starting point for open source projects a decade ago, these licenses are far more business-friendly, and most modern open source businesses are adopting them.

The Future

When we originally penned this article on open source four years ago, we aspirationally hoped that we would see the birth of iconic open source companies. At a time when there was only one model, Red Hat, we believed there would be many more. Today, we see a healthy cohort of open source businesses, which is quite exciting. I believe we are just scratching the surface of the kind of iconic companies we will see emerge from the open source gene pool. From one perspective, these companies, valued in the billions, are a testament to the power of the model. What is clear is that open source is no longer a fringe approach to software. When top companies around the world are polled, few of them intend to have their core software systems be anything but open source. And if the Fortune 5000 migrate their spend on closed source software to open source, we will see the emergence of a whole new landscape of software companies, with the leaders of this new cohort valued in the tens of billions of dollars.

Clearly, that day is not tomorrow. These open source companies will need to grow and mature and develop their products and organization in the coming decade. But the trend is undeniable and here at Index we’re honored to have been here for the early days of this journey.

Village Global’s accelerator introduces founders to Bill Gates, Reid Hoffman, Eric Schmidt and more

Village Global is leveraging its network of tech luminaries to support the next generation of entrepreneurs.

The $100 million early-stage venture capital firm, which counts as limited partners (LPs) Microsoft’s Bill Gates, Facebook’s Mark Zuckerberg, Alphabet’s Eric Schmidt, Amazon’s Jeff Bezos, LinkedIn’s Reid Hoffman and many other high-profile techies, quietly announced on Friday that the accelerator it piloted earlier this year would become a permanent fixture.

Called Network Catalyst, the accelerator provides formation-stage startups with $150,000 and three months of programming in exchange for 7 percent equity. Its key offering, however, is access to the firm’s impressive roster of LPs.

To formally announce Network Catalyst, Village brought none other than Bill Gates to San Francisco for a fireside chat with Eventbrite CEO Julia Hartz. During the hour-long talk, Gates handed out candid advice on building a successful company, insights on philanthropy and predictions on the future of technology. He later met individually with the founders of Village’s portfolio companies.

“I have a fairly hardcore view that there should be a very large sacrifice made during those early years,” Gates said. “In those early years, you need to have a team that’s pretty maniacal about the company.”

During the Q&A session, Gates retold one of his favorite anecdotes: in the early days of Microsoft, he would memorize his employees’ license plates so he knew when they were coming and going, quietly noting who was working the longest hours. He admitted, to no one’s surprise, that he struggled with work-life balance.

“I think you can over worship the idea of working extremely hard,” he said. “For my particular makeup, it’s really true I didn’t believe in weekends or vacations … Once I got in my 30s, I could hardly imagine how I’d done that because by then some natural thing inside of me kicked in and I loved weekends and my girlfriend liked vacations and that turned out to be a great thing.”

Gates has been an active investor in Village since it emerged one year ago. VMware founder Diane Greene, Disney CEO Bob Iger and Spanx CEO Sara Blakely are also on the firm’s long list of LPs.

Village is led by four general partners: Erik Torenberg, Product Hunt’s first employee; LinkedIn’s former chief of staff Ben Casnocha; Chegg’s former chief business officer Anne Dwane; and former Canaan partner Ross Fubini. They initially filed to raise a $50 million fund in mid-2017 but ultimately closed on $100 million in March. The firm relies heavily on scouts (angel investors and others knowledgeable about the startup world) to source deals. The scouts, in return, earn a portion of the firm’s returns.

Former Alphabet chairman Eric Schmidt.

An accelerator program has been part of Village’s plan since the beginning.

Pinterest CEO Ben Silbermann, Fidelity CEO Abby Johnson, Hoffman, Iger, Blakely and Schmidt all worked with Network Catalyst’s debut cohort of founders. Village co-founder Anne Dwane said Hoffman and former Twitter CEO Ev Williams have signed on to work with the next cohort.

“It is about contacts, not content,” Dwane told TechCrunch. “The most important thing is who you can meet to help you take your business forward.”

San Francisco-based VeriSIM, a startup building AI-enabled biosimulation models, was among the debut class of companies that participated in Network Catalyst. Jo Varshney, the company’s founder and CEO, said the accelerator’s personalization and customization set it apart from competing options.

“It seemed like I had a team of people working alongside me even though I’m a solo founder,” Varshney told TechCrunch.

After Varshney completed the program, Schmidt introduced her to a number of investors. She quickly closed a $1.5 million seed round.

“One year in and I already have a one-on-one meeting with Bill Gates,” she added.

Applications for the accelerator close on December 7 with programming kicking off January 14. Village plans to enroll at least 12 companies across industries.

You’ll now need a subscription to get the best of Microsoft Office

Microsoft released Office 2019 for Windows and macOS this week, the latest version of its regular, non-subscription productivity suite. It’s the kind of Office that, ten years ago, you would’ve bought in a shrink-wrapped package at Office Depot. But it’s really not the version of Office that Microsoft would like you to buy — or that you probably want to have. That’s because at this point, Office 2019 is basically a limited version that doesn’t include the most interesting new features of its Office 365 subscription counterpart.

“We are really working very hard to position Office 365 in all its flavors — ProPlus for the commercial users — as very different from these versions of Office that have a year number in them,” Microsoft’s corporate VP for Office and Windows Jared Spataro told me. “Office 2019, all the features that we released in it, had previously been released in Office 365. So our way of talking about the cloud versions of Office 365 is that they’re connected, that this breathes life into them.”

Spataro also noted that Microsoft wants users to remember that the connected Office 365 apps will offer higher productivity because of their cloud connectivity and a higher degree of security. He also argues that these versions deliver a lower total cost of ownership.

Back when Microsoft launched Office 2016, those releases were essentially snapshots (“carbon copies,” Spataro called them) of the regularly updated Office 365 versions, which get monthly updates and feature releases. For the first time, the on-premises version of Office provides only a subset of the full functionality, because virtually all of the most interesting new features, including the machine learning smarts now rolling out to Office 365, will be missing from Office 2019.

“I think there will be some confusion,” Spataro acknowledged. “It’ll take us some time to train people that the year number doesn’t mean it’s the best version.”

In a way, though, this makes sense, given that a lot of the new functionality that Microsoft is now building into Office 365 only works because it’s connected to the cloud. That’s the only way to pull in data for the new Microsoft Search functionality, for example, and to run the machine learning models and pull in data from those — and Microsoft has decided that the best way to charge for those is through a subscription.

Microsoft’s strategy isn’t all that different from Adobe’s, which now focuses on its Creative Cloud subscriptions and the cloud features that come with them to promote its subscription service over shrink-wrapped versions of its applications. That has been a very successful transition for Adobe, and Microsoft is looking for the same with Office 365 (and its Microsoft 365 counterpart).

Chef launches deeper integration with Microsoft Azure

DevOps automation service Chef today announced a number of new integrations with Microsoft Azure. The news, which was announced at the Microsoft Ignite conference in Orlando, Florida, focuses on helping enterprises bring their legacy applications to Azure and ranges from the public preview of Chef Automate Managed Service for Azure to the integration of Chef’s InSpec compliance product with Microsoft’s cloud platform.

With Chef Automate as a managed service on Azure, which gives ops teams a single tool for managing and monitoring their compliance and infrastructure configurations, developers can now easily deploy and manage Chef Automate and the Chef Server from the Azure Portal. It’s a fully managed service, and the company promises that businesses can get started with it in as little as thirty minutes (though I’d take that number with a grain of salt).

When those configurations need to change, Chef users on Azure can also now use the Chef Workstation with Azure Cloud Shell, Azure’s command line interface. Workstation is one of Chef’s newest products and focuses on making ad-hoc configuration changes, no matter whether the node is managed by Chef or not.

And to remain in compliance, Chef is also launching an integration of its InSpec security and compliance tools with Azure. InSpec works hand in hand with Microsoft’s new Azure Policy Guest Configuration (who comes up with these names?) and allows users to automatically audit all of their applications on Azure.

“Chef gives companies the tools they need to confidently migrate to Microsoft Azure so users don’t just move their problems when migrating to the cloud, but have an understanding of the state of their assets before the migration occurs,” said Corey Scobie, the senior vice president of products and engineering at Chef, in today’s announcement. “Being able to detect and correct configuration and security issues to ensure success after migrations gives our customers the power to migrate at the right pace for their organization.”

Why the Pentagon’s $10 billion JEDI deal has cloud companies going nuts

By now you’ve probably heard of the Defense Department’s massive winner-take-all $10 billion cloud contract dubbed the Joint Enterprise Defense Infrastructure (or JEDI for short).
Star Wars references aside, this contract is huge, even by government standards. The Pentagon would like a single cloud vendor to build out its enterprise cloud, believing, rightly or wrongly, that this is the best approach to maintain focus and control of its cloud strategy.

Department of Defense (DOD) spokesperson Heather Babb tells TechCrunch the department sees a lot of upside by going this route. “Single award is advantageous because, among other things, it improves security, improves data accessibility and simplifies the Department’s ability to adopt and use cloud services,” she said.

Whatever company they choose to fill this contract, this is about modernizing their computing infrastructure and their combat forces for a world of IoT, artificial intelligence and big data analysis, while consolidating some of their older infrastructure. “The DOD Cloud Initiative is part of a much larger effort to modernize the Department’s information technology enterprise. The foundation of this effort is rationalizing the number of networks, data centers and clouds that currently exist in the Department,” Babb said.

Setting the stage

It’s possible that whoever wins this DOD contract could have a leg up on other similar projects in the government. After all, it’s not easy to pass muster on security and reliability with the military, and if one company can prove itself capable in this regard, it could be set up well beyond this one deal.

As Babb explains it though, it’s really about figuring out the cloud long-term. “JEDI Cloud is a pathfinder effort to help DOD learn how to put in place an enterprise cloud solution and a critical first step that enables data-driven decision making and allows DOD to take full advantage of applications and data resources,” she said.

The single-vendor requirement, however, could explain why the various cloud vendors who are bidding have lost their minds a bit over it. Everyone except Amazon, that is, which has been mostly silent, apparently happy to let the process play out.

The belief among the various other players is that Amazon is in the driver’s seat for this bid, possibly because it delivered a $600 million cloud contract for the government in 2013, standing up a private cloud for the CIA. It was a big deal at the time on a couple of levels. First of all, it was the first large-scale example of an intelligence agency using a public cloud provider. And the amount of money was pretty impressive for its day: not $10 billion impressive, but a nice contract.

For what it’s worth, Babb dismisses such talk, saying that the process is open and no vendor has an advantage. “The JEDI Cloud final RFP reflects the unique and critical needs of DOD, employing the best practices of competitive pricing and security. No vendors have been pre-selected,” she said.

Complaining loudly

As the Pentagon moves toward selecting its primary cloud vendor for the next decade, Oracle in particular has been complaining to anyone who will listen that Amazon has an unfair advantage in the deal, going so far as to file a formal complaint last month, even before bids were in and long before the Pentagon made its choice.

Somewhat ironically, given its own past business model, Oracle complained, among other things, that the deal would lock the department into a single platform over the long term. It also questioned whether the bidding process adhered to procurement regulations for this kind of deal, according to a report in the Washington Post. In April, Bloomberg reported that co-CEO Safra Catz complained directly to the president that the deal was tailor-made for Amazon.

Microsoft hasn’t been happy about the one-vendor idea either, pointing out that by limiting itself to a single vendor, the Pentagon could be missing out on innovation from the other companies in the back and forth world of the cloud market, especially when we’re talking about a contract that stretches out for so long.

As Microsoft’s Leigh Madden told TechCrunch in April, the company is prepared to compete, but doesn’t necessarily see a single vendor approach as the best way to go. “If the DOD goes with a single award path, we are in it to win, but having said that, it’s counter to what we are seeing across the globe where 80 percent of customers are adopting a multi-cloud solution,” he said at the time.

He has a valid point, but the Pentagon seems hell-bent on going forward with the single-vendor idea, even though the cloud offers much greater interoperability than the proprietary stacks of the 1990s (of which Oracle and Microsoft were prime examples at the time).

Microsoft has its own large DOD contract in place for almost a billion dollars, although this deal from 2016 was for Windows 10 and related hardware for DOD employees, rather than a pure cloud contract like Amazon has with the CIA.

It also recently released Azure Stack for government, a product that lets government customers install a private version of Azure with all the same tools and technologies you find in the public version, and could prove attractive as part of its JEDI bid.

Cloud market dynamics

It’s also possible that Amazon’s control of the largest chunk of the cloud infrastructure market plays a part here at some level. While Microsoft has been coming on fast, it’s still about a third of Amazon’s size in terms of market share, as Synergy Research’s Q4 2017 data clearly shows.

The market hasn’t shifted dramatically since this data came out. While market share alone wouldn’t be a deciding factor, Amazon came to market first, and it is bigger than the next four players combined, according to Synergy. That could explain why the other players are lobbying so hard and see Amazon as the biggest threat here: it’s probably the biggest threat in almost every deal where they come up against each other, due to its sheer size.

Consider also that Oracle, which seems to be complaining the loudest, came rather late to the cloud after years of dismissing it. It could see JEDI as a chance to establish a foothold in government that it could use to build out its cloud business in the private sector, too.

10 years might not be 10 years

It’s worth pointing out that the actual deal has the complexity and opt-out clauses of a sports contract, with just an initial two-year term guaranteed. A couple of three-year options follow, with a final two-year option closing things out. The idea is that if this turns out to be a bad idea, the Pentagon has various points where it can back out.

In spite of the winner-take-all approach of JEDI, Babb indicated that the agency will continue to work with multiple cloud vendors no matter what happens. “DOD has and will continue to operate multiple clouds and the JEDI Cloud will be a key component of the department’s overall cloud strategy. The scale of our missions will require DOD to have multiple clouds from multiple vendors,” she said.

The DOD accepted final bids in August, then extended the Request for Proposal deadline to October 9th. Unless the deadline gets extended again, we’re probably going to finally hear who the lucky company is sometime in the coming weeks, and chances are there is going to be a lot of whining and continued maneuvering from the losers when that happens.

Cryptocurrency mining attacks using leaked NSA hacking tools are still highly active a year later

It’s been over a year since highly classified exploits built by the National Security Agency were stolen and published online.

One of the tools, dubbed EternalBlue, can covertly break into almost any Windows machine around the world. It didn’t take long for hackers to start using the exploits to run ransomware on thousands of computers, grinding hospitals and businesses to a halt. Two separate attacks in as many months used WannaCry and NotPetya ransomware, which spread like wildfire. Once a single computer in a network was infected, the malware would also target other devices on the network. The recovery was slow and cost companies hundreds of millions in damages.

Yet, more than a year since Microsoft released patches that slammed the backdoor shut, almost a million computers and networks are still unpatched and vulnerable to attack.

Although WannaCry infections have slowed, hackers are still using the publicly accessible NSA exploits to infect computers to mine cryptocurrency.

Nobody knows that better than one major Fortune 500 multinational, which was hit by a massive WannaMine cryptocurrency mining infection just days ago.

“Our customer is a very large corporation with multiple offices around the world,” said Amit Serper, who heads the security research team at Boston-based Cybereason.

“Once their first machine was hit the malware propagated to more than 1,000 machines in a day,” he said, without naming the company.

Cryptomining attacks have been around for a while. It’s more common for hackers to inject cryptocurrency mining code into vulnerable websites, but the payoffs are low. Some news sites are now installing their own mining code as an alternative to running ads.

But WannaMine works differently, Cybereason said in its post-mortem of the infection. By using the leaked NSA exploits to gain a single foothold in a network, the malware tries to infect any computer within it. It’s persistent, so it survives reboots. Once implanted, the malware uses the computer’s processor to mine cryptocurrency. Spread across dozens, hundreds or even thousands of computers, it can mine cryptocurrency far faster and more efficiently. Though it’s a drain on energy and computing resources, it can often go unnoticed.

After the malware spreads within the network, it modifies power management settings to prevent infected computers from going to sleep. Not only that, it tries to detect other cryptomining scripts running on the computer and terminates them, likely to squeeze every bit of energy out of the processor and maximize its mining effort.
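
To see why mining saturates a processor, and why the malware bothers to kill rival miners, consider a toy proof-of-work loop. This is a generic illustration of CPU mining, not WannaMine’s actual payload:

```python
# Toy proof-of-work loop: the miner hashes candidate nonces until one
# clears a difficulty target. Each attempt is cheap, but millions may be
# needed, which is why mining pegs the CPU for as long as it runs.
import hashlib

def mine(block_data: bytes, difficulty: int = 5) -> int:
    """Find a nonce whose SHA-256 hash starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

print(mine(b"example block"))  # burns CPU until a matching nonce is found
```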

At least 300,000 computers or networks are still vulnerable to the NSA’s EternalBlue hacking tools.

Based on up-to-date statistics from Shodan, a search engine for open ports and databases, at least 919,000 servers are still vulnerable to EternalBlue, with some 300,000 machines in the US alone. And that’s just the tip of the iceberg: each figure can represent either an individual vulnerable computer or a vulnerable network server capable of infecting hundreds or thousands more machines.
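
For those who want to reproduce that kind of tally, Shodan exposes a count endpoint through its Python library. A minimal sketch, assuming an API key whose plan includes the `vuln:` search filter (it is not available on free accounts); the query strings are illustrative:

```python
# Sketch of counting hosts still exposed to MS17-010 (EternalBlue) via the
# Shodan API. Requires an API key with access to the `vuln:` filter.
import shodan

api = shodan.Shodan("YOUR_API_KEY")  # placeholder key

for query in ("vuln:ms17-010", "vuln:ms17-010 country:US"):
    try:
        result = api.count(query)  # returns totals without pulling results
        print(f"{query}: {result['total']} hosts")
    except shodan.APIError as err:
        print(f"query failed ({query}): {err}")
```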

Cybereason said companies are still severely impacted because their systems aren’t protected.

“There’s no reason why these exploits should remain unpatched,” the blog post said. “Organizations need to install security patches and update machines.”

If not ransomware yesterday, it’s cryptomining malware today. Given how versatile the EternalBlue exploit is, tomorrow it could be something far worse — like data theft or destruction.

In other words: if you haven’t patched already, what are you waiting for?

The Xbox Adaptive Controller goes on sale today and is also now part of the V&A museum’s collection

In an important move for inclusion in the gaming community, the Xbox Adaptive Controller, created for gamers with mobility issues, is now on sale. The Victoria and Albert Museum (V&A) also announced today that it has acquired the Xbox Adaptive Controller for display in its Rapid Response gallery dedicated to current events and pop culture.

First introduced in May, the Xbox Adaptive Controller can now be purchased online for $99.99. To create the controller, Microsoft collaborated with gamers with disabilities and limited mobility, as well as partners from several organizations, including the AbleGamers Charity, the Cerebral Palsy Foundation, Special Effect and Warfighter Engaged.

According to Microsoft, the Xbox Adaptive Controller project first took root in 2014 when one of its engineers spotted a custom gaming controller made by Warfighter Engaged, a non-profit that provides gaming devices for wounded and disabled veterans. During several of Microsoft’s hackathons, teams of employees began working on gaming devices for people with limited mobility, which in turn gave momentum to the development of the Xbox Adaptive Controller.

In its announcement, the V&A said it added the Xbox Adaptive Controller to its collection because “as the first adaptive controller designed and manufactured at large-scale by a leading technology company, it represents a landmark moment in videogame play, and demonstrates how design can be harnessed to encourage inclusivity and access.”

The Xbox Adaptive Controller features two large buttons that can be programmed to fit its user’s needs, as well as 19 jacks and two USB ports spread out in a single line on the back of the device to make them easier to access. Symbols embossed along the back of the controller’s top help identify the ports so gamers don’t have to turn the device around or lift it up to find the one they need, while grooves serve as guidelines to help them plug in devices. Based on gamer feedback, Microsoft moved controls, including the D-pad, to the side of the device and put the A and B buttons closer together, so users can easily move between them with one hand.

The controller slopes down toward the front, enabling gamers to slide their hands onto it without having to lift them (which also makes it easier to control with feet), and has rounded edges to reduce the chance of injury if it’s dropped on a foot. The Xbox Adaptive Controller was designed to rest comfortably in laps and also has three threaded inserts so it can be mounted to wheelchairs, lap boards or desks.

In terms of visual design, the Xbox Adaptive Controller is sleek and unobtrusive, since Microsoft heard from many gamers with limited mobility that they dislike using adaptive devices because they often look like toys. The company’s attention to detail also extends into the controller’s packaging, which is very easy to unbox because gamers told Microsoft that they are often forced to open boxes and other product packages with their teeth.