Last week, Google announced details of Always Free, a free tier of Google Cloud Platform services that allows users to gain familiarity with the platform’s suite of offerings. The free tier spans 15 Google Cloud Platform services, including Google Compute Engine, Google App Engine, Google Cloud Datastore, Google Cloud Functions, Google Stackdriver, Google BigQuery Public Datasets and Google Container Engine. In the case of Google Compute Engine, users have access to one f1-micro instance each month, with the additional constraint that they can run a maximum of 8 virtual CPU cores concurrently, as Google Cloud Platform notes:
You can have no more than 8 cores (or Virtual CPUs) running at the same time. For example, you can launch eight n1-standard-1 machines, or two n1-standard-4 machines, but you can’t launch an n1-standard-16 machine. For more information about the types of virtual machines available and the number of cores they use, see Machine type pricing.
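The concurrency rule quoted above can be sketched as a simple check. The vCPU counts below match Google’s published n1-standard machine types, but the helper function itself is purely illustrative and not part of any Google API:

```python
# Sketch of the Always Free concurrency rule: no more than 8 vCPU
# cores may run at the same time. The helper is hypothetical; the
# vCPU counts reflect Google's published machine types.
VCPUS = {
    "f1-micro": 1,          # shared-core, counts as one vCPU toward the cap
    "n1-standard-1": 1,
    "n1-standard-4": 4,
    "n1-standard-16": 16,
}

def fits_core_cap(machine_types, cap=8):
    """Return True if the listed instances stay within the vCPU cap."""
    return sum(VCPUS[m] for m in machine_types) <= cap

# Mirrors the examples in the quoted passage:
print(fits_core_cap(["n1-standard-1"] * 8))   # eight 1-vCPU machines: True
print(fits_core_cap(["n1-standard-4"] * 2))   # two 4-vCPU machines: True
print(fits_core_cap(["n1-standard-16"]))      # one 16-vCPU machine: False
```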
The free tier’s availability is limited to U.S. regions, and the Compute Engine allotment includes 30 GB-months of HDD storage and 5 GB-months of snapshot storage. Alongside its Always Free tier, Google also elaborated details of a $300 credit that customers can apply to their usage of Google Cloud Platform products to further augment their ability to experiment with GCP services. The $300 credit applies to all Google Cloud Platform products and spans a duration of 12 months.
Announced by Sam Ramji, VP of Product Development at Google Cloud, at the Google Cloud Next conference on Friday, March 10, the Always Free tier and the $300 credit represent an important sales and marketing initiative designed to lure new customers into trying the features and functionality of the Google Cloud Platform. As enterprises increasingly adopt a multi-cloud strategy, using multiple public clouds to minimize the threats posed by vendor lock-in and the effects of cloud outages, Google Cloud Platform’s Always Free tier promises to increase its market share in a field dominated by Amazon Web Services that also includes Microsoft Azure, Oracle and IBM. Meanwhile, Google’s ability to onboard new customers via its Always Free tier raises the obvious question of whether it can retain those customers, a task that demands an aggressive sales and customer satisfaction team capable of eliciting and responding to the needs of a growing customer base.
As noted by Urs Hölzle, SVP of Google Cloud Infrastructure, in a recent blog post, Google Cloud Platform has adopted the Intel Xeon Skylake processor, a next-generation microprocessor optimized for high-performance compute workloads. Skylake’s Intel Advanced Vector Extensions render it exceptionally well suited for 3D modeling, compute-intensive data analytics, genomics, scientific modeling and engineering simulations. Moreover, Google Cloud Infrastructure has optimized Skylake for Google’s VMs to ensure that Google Cloud Platform customers receive the maximum benefit from its capabilities. The availability of Skylake on the Google Cloud Platform marks the fulfillment of a promise made by Google and Intel in November to integrate Intel’s most recent microprocessor into the platform. Google Cloud Platform now claims the distinction of being the first public cloud to offer the Intel Xeon Skylake processor, a hugely important deal for Intel as it faces increased competition from AMD, particularly given the forthcoming launch of the AMD Ryzen microprocessor. Meanwhile, Google stands to benefit from Skylake’s ability to enhance customer capabilities for applications requiring more intensive computational power and modeling. Importantly, the availability of Skylake on the Google Cloud Platform illustrates a broader partnership between Google and Intel aimed at accelerating enterprise cloud adoption.
Google has announced details of a key management service, in Beta in select countries, that allows enterprises to manage the encryption keys for their cloud-based deployments. The ability of Google Cloud customers to manage their own encryption keys enhances the security of Google’s public cloud platform because customers now have the option of taking ownership of the encryption keys for their cloud deployments. Branded Google Cloud Key Management Service (Cloud KMS), Google’s expanded encryption functionality gives it parity with the AWS Key Management Service and Azure Key Vault with respect to customer-owned encryption options. Customers interested in retaining control over their encryption keys can choose to store them in the cloud or on-premises. Google’s ability to hand encryption keys to its customers is enabled by technology that “uses the Advanced Encryption Standard (AES), in Galois/Counter Mode (GCM), the same encryption library used internally at Google to encrypt data in Google Cloud Storage,” as noted in a blog post.
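As a rough illustration of the cipher mode named in that quote, the snippet below performs an AES-256-GCM encrypt/decrypt round trip using the third-party Python `cryptography` package. It sketches the primitive only; it does not represent Cloud KMS’s actual implementation or API, and the key handling shown is deliberately simplistic:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Generate a 256-bit AES key. (With Cloud KMS, the key would be
# created and held by the service or the customer, never inline.)
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)                 # the standard 96-bit GCM nonce
aad = b"object-metadata"               # authenticated but not encrypted
ciphertext = aesgcm.encrypt(nonce, b"customer data", aad)

# GCM authenticates as well as encrypts: decryption fails loudly if
# the ciphertext, nonce, or associated data has been tampered with.
plaintext = aesgcm.decrypt(nonce, ciphertext, aad)
assert plaintext == b"customer data"
```

The appeal of GCM for a storage service is precisely that combination of confidentiality and integrity in a single pass.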
Google Cloud Platform’s decision to give customers the option of controlling their encryption keys puts it on par with its competitors AWS and Azure while satisfying the cloud security needs of customers in highly regulated industries such as healthcare and finance, which typically require greater ownership of the mechanisms of encryption and decryption. Cloud security promises to be an intense area of interest in 2017, and Google’s achievement in coming up to speed with two of its key competitors on encryption functionality marks only the beginning of a broader conversation about cloud security that stands to unfold over the next 12 to 18 months. Given the dramatic proliferation of high-profile cloud security breaches in recent months, expect Google Cloud Platform, AWS and Azure to keep enhancing their cloud security options in 2017, particularly since cloud security could represent the game-changer for market share in the public and hybrid cloud space.
On December 27, Google Cloud Platform announced that Stackdriver Trace (Trace), its tool for analyzing application latency and performance, now supports limited interoperability with Zipkin, the distributed tracing system open-sourced by Twitter in 2012. Trace can now receive application performance traces from Zipkin as a result of Google’s recent release of a Zipkin server. This enables customers to leverage the power of Trace for applications written in languages that Trace does not currently support, as well as in conjunction with Zipkin. Trace currently supports Google App Engine-native applications and provides libraries for applications written in Node.js, Java and Go that run on VMs or containers. Trace plans to support Ruby and .NET in the near future, and, as such, its ability to receive traces from Zipkin via the newly released Zipkin server expands the universe of applications whose latency and performance it can analyze while also allowing users to compare the relative merits of Trace and Zipkin. Trace’s ability to accept traces from Zipkin marks a notable step forward for technologies dedicated to understanding application latency and the root causes of performance degradation, given that Salesforce and Yelp, alongside Twitter, support Zipkin. Expect Google to continue augmenting Trace as it gains traction among Zipkin users and, conversely, for Zipkin to evolve as a result of the insights delivered through its compatibility with Stackdriver Trace.
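For a sense of what a Zipkin server actually ingests, the sketch below assembles a minimal client-side span in the Zipkin v1 JSON format; the service name, operation name, and identifiers are made-up placeholders. A real tracer library would build such spans automatically and POST them to the collector rather than hand-rolling them:

```python
import json
import time

# A minimal Zipkin v1 span: the "cs"/"cr" annotations mark the client
# sending the request and receiving the response. All identifiers and
# names here are illustrative placeholders.
now_us = int(time.time() * 1_000_000)
span = {
    "traceId": "48485a3953bb6124",     # 64-bit id, hex-encoded
    "id": "48485a3953bb6124",
    "name": "get /api/orders",
    "timestamp": now_us,               # epoch microseconds
    "duration": 25_000,                # 25 ms, in microseconds
    "annotations": [
        {"timestamp": now_us, "value": "cs",
         "endpoint": {"serviceName": "storefront"}},
        {"timestamp": now_us + 25_000, "value": "cr",
         "endpoint": {"serviceName": "storefront"}},
    ],
    "binaryAnnotations": [],
}

# Zipkin collectors accept a JSON array of spans.
payload = json.dumps([span])
```

Because the wire format is this simple, any backend that can decode it, such as Google’s Zipkin server feeding Stackdriver Trace, can receive spans from existing Zipkin tracers unchanged.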
Google has acquired Qwiklabs, the company behind an educational platform geared toward helping users understand cloud computing and how to write cloud-native applications. Launched in 2012, Qwiklabs has focused on helping users obtain training for Amazon Web Services, but Google plans to adapt it to deliver education related to the Google Cloud Platform and its associated G Suite of applications. Google’s acquisition of Qwiklabs underscores the heterogeneity of products and services surrounding the cloud computing revolution and illustrates the urgency of the tech industry’s need for quality training and educational products specific to cloud computing and big data. Expect educational platforms related to cloud computing to proliferate as cloud adoption continues to skyrocket and fuels a need for materials that can train end users on the rapidly evolving space of cloud technologies and platforms. Qwiklabs claims that over 500,000 users have received 5 million hours of training on its platform thus far. Terms of the acquisition were not disclosed.
Amazon reported earnings per share of 52 cents on Thursday, missing analysts’ target of 78 cents by a margin that, in combination with other data points from the earnings report, sent the stock down 5% in trading on Friday. For the quarter ending September 30, 2016, the company reported revenue of $32.71 billion, slightly exceeding Wall Street estimates of $32.69 billion. Meanwhile, Amazon’s fourth-quarter revenue guidance of $42 billion to $45.5 billion leaned toward the lower side of Wall Street’s expectation of $44.58 billion. To make matters more worrisome for investors, Amazon projected fourth-quarter operating income of between zero and $1.25 billion, whereas Wall Street had projected $1.62 billion. On a positive note, the company’s cloud services business unit, Amazon Web Services, claimed revenue of $3.23 billion, a 55% increase over the $2.08 billion recorded in the third quarter of last year, surpassing Wall Street’s expectation of $3.17 billion. Amazon explained its less-than-stellar earnings report by noting its heavy investments in original video content for Amazon Prime as well as in fulfillment centers. Nevertheless, Amazon’s earnings-per-share miss and third-quarter results more generally raised eyebrows in both the technology and investor communities after a year of impressive growth and a preserved lead in the cloud computing space despite intensified competition. Axcient CEO Justin Moore remarked on Amazon’s earnings miss as follows:
Despite the Q3 EPS miss, over the longer-term, Amazon will continue to be a dominant force in both e-commerce and enterprise infrastructure – an incredible feat given that the customer sets are on the opposite ends of the spectrum. Amazon has been very clear that it will continue to focus on growth and not profitability. Investors have signed up for this approach for years so the blip in the stock will be tempered. Bezos has Amazon ‘primed’ for a dominant push to 2020 – and beyond. AWS and Prime continue to be Amazon’s primary growth and revenue drivers as the Seattle company broadens its lead in online commerce and cloud-computing services. The only real question for Amazon comes down to two factors: 1) its ability to appease investors’ appetite for ongoing record growth and 2) can it continue to maintain its lead over Microsoft, Google and Oracle, who are equally committed to winning the cloud and have the benefit of being second movers, which can be a benefit in these situations as infrastructure ages out and size and scale become inhibitors to innovation and performance. Expect to see all of them leverage M&A to acquire their way to technical leadership and hold an edge over the competition. That said, while there is plenty of startup talent to be bought at a premium, I don’t see Amazon losing this race anytime soon.
Here, Moore opines that Amazon’s ability to manage investor expectations and shake off the “second mover” advantage enjoyed by competitors such as Microsoft, Google and Oracle will determine whether it can continue the dominance in “e-commerce and enterprise infrastructure” that it has delivered to date. Moore also notes that second movers stand to benefit from their ability to outpace the “size and scale” of their competitors with enhanced agility and innovation. Herein lie the stakes of Bezos’s gamble on innovation and investment in Amazon’s infrastructure: if Amazon can indeed innovate at the rapidly expanding scale of its business and cloud operations with the agility of its competitors, by re-investing the resources acquired through its meteoric growth to date, it stands poised to radically reconfigure the technology landscape over the next ten years in ways analogous to the disruption that Amazon Web Services brought to cloud computing. But in the event that the size and complexity of Amazon’s infrastructure militates against its ability to keep delivering innovation, the chances of competitors such as Oracle and Google catching up to it, at least on the cloud services front, increase dramatically. According to Amazon’s CFO, Brian Olsavsky, Amazon built 18 new fulfillment centers in the third quarter while investing heavily in video content to enable Amazon Prime’s video offerings to compete with Netflix. With respect to Amazon Web Services, however, one obvious question investors may have following last week’s earnings report concerns how Amazon intends to invest in AWS in response to Google’s rebranding of its cloud-based products and services, coupled with Google’s aggressive emphasis on professional services for the enterprise.
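The AWS growth figure cited in the earnings discussion above can be sanity-checked with quick arithmetic on the two reported quarterly revenues:

```python
# Reported AWS revenue: Q3 2016 vs Q3 2015, in $ billions.
q3_2016 = 3.23
q3_2015 = 2.08

growth = (q3_2016 - q3_2015) / q3_2015
print(f"{growth:.1%}")   # roughly 55%, matching the reported figure
```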
Pivotal and Google have announced a collaboration whereby Pivotal Cloud Foundry, the platform-as-a-service based on the open source Cloud Foundry project, will now be generally available on the Google Cloud Platform. Pivotal’s collaboration with Google builds upon existing partnerships with Amazon Web Services and Microsoft Azure and gives it expanded access to developers building applications on cloud-based platforms. Key perks of running Pivotal Cloud Foundry applications on the Google Cloud Platform include access to Google Cloud’s load balancing technology as well as Google’s data and machine learning services such as Google BigQuery, Google Cloud SQL and the Google Cloud Natural Language API. The availability of Google’s data and machine learning services testifies to an impressive depth of integration between Pivotal Cloud Foundry and the Google Cloud Platform, enabled by custom-built service brokers created by Google’s engineering team. The ability to create Pivotal Cloud Foundry-based apps on the Google Cloud Platform, with full access to Google’s enviable roster of data and machine learning products, gives developers a rich portfolio of battle-tested building blocks with which to build and iteratively enhance their applications. Stay tuned to the cadence of the integration between the two platforms to see whether it renders Google Cloud Platform a more promising partner for Pivotal Cloud Foundry developers and customers than Amazon Web Services or Microsoft Azure. In the here and now, however, Pivotal, which is part of Dell Technologies, stands positioned to expand its reach to enterprise customers via a partnership that differentiates itself by way of access to Google’s renowned big data and machine learning technologies.