Posted:
Cross-posted from the Google Analytics Blog

With 10.6 million cell phone customers and retail stores in 400+ markets, U.S. Cellular needs to reach a lot of people with marketing messages. That's why U.S. Cellular uses many marketing channels -- online, in-store and telesales -- to drive mobile phone activations.

U.S. Cellular faced a challenge, though: they didn't know how many of their offline sales were driven by their digital marketing, which made it harder to adjust their media mix and to forecast sales. To fix that, U.S. Cellular and its digital-analytics firm, Cardinal Path, turned to Google Analytics Premium and its integration with Google BigQuery.

Part of Google Cloud Platform, BigQuery allows for highly flexible analysis of large datasets. The U.S. Cellular team used it to integrate and analyze terabytes of data from Google Analytics Premium and other systems. Then they mapped consumer behavior across online and offline marketing channels. Each transaction was attributed to the consumer touchpoints that the buyer had made across various sales channels.

The result: U.S. Cellular got real insight into digital’s role in their sales. They were surprised to find that they could reclassify nearly half of all their offline activations to online marketing channels.

U.S. Cellular now uses this complete (and fully automated) analytics framework to see the whole consumer journey and forecast sales for each channel. Their team has the data they need to make better business decisions.

“We’re now in the enviable position of having an accurate view at each stage of our customer journey," says Katie Birmingham, a digital & e-commerce analyst for the company. "The Google Analytics Premium solution not only gives us a business advantage, but helps us shape a great customer experience, and ultimately ties in to our values of industry-leading innovation and world-class customer service.”

Be sure to read the full case study.

-Posted by: Suzanne Mumford, Google Analytics Premium Marketing

Posted:
Editor’s note: Today’s guest blog comes from Ron Zalkind, co-founder and CTO of Waltham, Massachusetts-based CloudLock, a leading cloud security provider. The largest organizations in the world trust CloudLock to secure their data in the cloud, increase collaboration, and reduce their risk.

At a time when more and more organizations are moving their most sensitive data assets and applications to the cloud, security takes center stage, often at the price of user productivity. At CloudLock, we believe that each and every organization should work hard to protect their data and users in the cloud, not from it. With this philosophy in mind, CloudLock provides cloud security applications that help over 700 businesses using Google Apps and Salesforce to enforce regulatory, operational and security compliance.

We've been building enterprise products on top of Google Cloud Platform, specifically Google App Engine, for four years now. Collaborating with Google early on allowed us to leverage its best-in-class infrastructure security and scalability, both paramount for us as a security provider.

Our business must be as agile as our customers. Their accelerated SaaS platform adoption means that our business is data intensive, continuously processing changes in billions of objects and thousands of third-party applications connected to Google Apps user accounts. We’ve built a massive real-time data processing solution using App Engine, enabling us to focus on delivering core value for customers instead of focusing on infrastructure. App Engine’s auto-scaling capabilities provide our customers with a security solution that grows with their business, and its development features enable us to release code improvements frequently and seamlessly.
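A rough sketch of this kind of App Engine pattern on the Python runtime (not CloudLock's production code; the handler paths, queue name and processing logic below are illustrative only): a front-end handler acknowledges a change notification quickly, then hands the heavy lifting to a push queue whose workers scale automatically.

# Hypothetical sketch of an App Engine push-queue pipeline; not CloudLock's code.
import webapp2
from google.appengine.api import taskqueue

class ChangeHandler(webapp2.RequestHandler):
    # Receives a change notification and defers the real work to a queue.
    def post(self):
        object_id = self.request.get('object_id')
        taskqueue.add(url='/tasks/scan',
                      params={'object_id': object_id},
                      queue_name='scan-queue')
        self.response.set_status(202)  # accepted; processing happens out of band

class ScanWorker(webapp2.RequestHandler):
    # Runs on the task queue, outside the user-facing request path.
    def post(self):
        object_id = self.request.get('object_id')
        # ... inspect the object and record any policy findings ...

app = webapp2.WSGIApplication([('/changes', ChangeHandler),
                               ('/tasks/scan', ScanWorker)])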

As a SaaS company, high-quality service is our bond. To keep delivering best-in-class service, our team leverages the central management features available through the admin panel, along with Google BigQuery and Google Cloud Storage, for advanced service analysis, monitoring and delivery. We have also been using Premier Support since its launch, which boosts our ability to provide enterprise-level customer support.

As a premier Google Apps partner, we rely on Cloud Platform to provide enterprise-class cloud security to over five million users. It’s a beautiful synergy.

-Posted by Ron Zalkind, co-founder and CTO of CloudLock

Posted:
A new, free Udacity online course, Developing Scalable Apps with Google App Engine, helps Java developers learn how to build scalable App Engine applications. As you work through the course, you'll build a conference management application that lets users create and query conferences.


The course starts with an entertaining introduction to Platform as a Service (PaaS). Magnus Hyttsten, Google Developer Advocate, discusses the evolution of server-side computing, from apps that could run on a computer under your desk to applications that require the computing power of a data center. (This is not without disadvantages, as he points out: "It is no longer possible to warm your feet on the fan outlet.")

Urs Hölzle, Senior VP of Infrastructure at Google, gives the background on Google Cloud Platform: "We built our internal cloud a long time ago, for our own needs. We had very large-scale applications, so we needed a very capable cloud, and now we're making our cloud available to everyone. In Cloud Platform, App Engine is the one system that makes it really easy for you to start very small and then scale to a very large user base."


After learning about the evolution of the data center and the need for scalability, you'll get right down to business and learn how to store data in the Datastore, use Memcache to speed up responses and cut down on Datastore quota usage, write queries, understand indexes, and use queues for tasks that execute outside front end requests.
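The course itself is in Java, but the caching pattern it teaches is the same in any App Engine runtime. A minimal sketch using the Python NDB and Memcache APIs (the Conference model and its fields here are illustrative, not the course's actual code):

from google.appengine.api import memcache
from google.appengine.ext import ndb

class Conference(ndb.Model):
    # Illustrative entity; the course defines its own Conference model in Java.
    name = ndb.StringProperty(required=True)
    topics = ndb.StringProperty(repeated=True)

def get_conference(websafe_key):
    # Check Memcache first; fall back to a Datastore lookup on a miss.
    conference = memcache.get(websafe_key)
    if conference is None:
        conference = ndb.Key(urlsafe=websafe_key).get()
        memcache.add(websafe_key, conference, time=3600)  # cache for an hour
    return conference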

Along the way, you'll build a backend App Engine application that uses Google Cloud Endpoints to expose its API to other applications.



You'll learn how to implement Endpoints to make the API available externally, and how to use the Endpoints API from an Android application.
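For a feel of what exposing an API through Endpoints looks like, here is a minimal sketch in the Python Endpoints framework, roughly equivalent to the Java annotations the course uses (the API name, version and message fields are illustrative):

import endpoints
from protorpc import message_types, messages, remote

class ConferenceForm(messages.Message):
    # Illustrative request/response message.
    name = messages.StringField(1)
    city = messages.StringField(2)

@endpoints.api(name='conference', version='v1')
class ConferenceApi(remote.Service):
    # Exposed over HTTP to web and Android clients via Cloud Endpoints.
    @endpoints.method(message_types.VoidMessage, ConferenceForm,
                      path='conference/latest', http_method='GET',
                      name='getLatestConference')
    def get_latest_conference(self, request):
        # A real implementation would query the Datastore here.
        return ConferenceForm(name='Example Conference', city='San Francisco')

api = endpoints.api_server([ConferenceApi])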

If you take this course, you'll not only learn about App Engine, but you'll use it behind the scenes too. Udacity uses App Engine to serve its online courses. Mike Sokolsky, Udacity co-founder and CTO, talks about Udacity's decision to use App Engine to host Udacity's MOOCs. He says, "It pushes you in the right direction. It pushes you to the best design practices for building a scalable application." And that's what this course aims to help you do, too.

You can take the course, Developing Scalable Apps with Google App Engine, at www.udacity.com/course/ud859.

The full course materials (all the videos, quizzes, and forums) are available free to all students by selecting "View Courseware". Personalized ongoing feedback and guidance from coaches are also available to anyone who chooses to enroll in Udacity's guided program.

For more courses that Google and Udacity are developing together, see www.udacity.com/google.

"
-Posted by Jocelyn Becker, Developer Advocate

Posted:
Starting today, the Google Cloud Monitoring Read API is generally available, allowing you to programmatically access metric data from your running services, such as CPU usage or disk IO. For example, you can use the Cloud Monitoring Read API with Nagios to plug into your existing alerting/event framework, or with Graphite to combine the data with your existing graphs. Third-party providers can also use the API to integrate Google Cloud Platform metrics into their own monitoring services.

The Cloud Monitoring Read API allows you to query current and historical metric data for up to the past 30 days. You can also use labels to filter the data down to more specific metrics (e.g. by zone). Currently, the Cloud Monitoring Read API supports reading metric time series data from the following Cloud Platform services:
  • Google Compute Engine - 13 metrics
  • Google Cloud SQL - 12 metrics
  • Google Cloud Pub/Sub - 14 metrics

Our documentation provides a full list of supported metrics. Over time, we will add metrics for more Cloud Platform services and enhance the metrics for existing services. You can see an example of usage and try these metrics for yourself on our getting started page. For samples and libraries, click here.

Example: getting CPU usage time series data
GET \
https://www.googleapis.com/cloudmonitoring/v2beta1/ \  # Access API
projects/YOUR_PROJECT_NAME/ \                          # For YOUR_PROJECT_NAME
timeseries/ \                                          # get time series of points
compute.googleapis.com%2Finstance%2Fcpu%2Fusage_time?\ # of CPU usage
youngest=2014-07-11T10%3A29%3A53.108Z& \           # with this latest timestamp
key={YOUR_API_KEY}                                     # using this API key
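The same request can also be issued programmatically. Here's a minimal sketch in Python, assuming the third-party requests library and an API key for your project:

import requests

BASE = 'https://www.googleapis.com/cloudmonitoring/v2beta1'
PROJECT = 'YOUR_PROJECT_NAME'
# Metric name with '/' percent-encoded, exactly as in the URL above.
METRIC = 'compute.googleapis.com%2Finstance%2Fcpu%2Fusage_time'

resp = requests.get(
    '{0}/projects/{1}/timeseries/{2}'.format(BASE, PROJECT, METRIC),
    params={'youngest': '2014-07-11T10:29:53.108Z',  # latest point to return
            'key': 'YOUR_API_KEY'})
resp.raise_for_status()
for series in resp.json().get('timeseries', []):
    print(len(series.get('points', [])), 'points returned')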
Your feedback is important!
We look forward to receiving feedback and suggestions at cloud-monitoring-feedback@googlegroups.com.

-Posted by Amir Hermelin, Product Manager

Posted:
Today’s guest blog comes from Rafael Sanches, engineer at Allthecooks, a social media platform for people who love cooking. Allthecooks is available on Android, iPhone, Windows Phone, Google Glass and Android Wear, and is a top recipe app on Google Play.

At Allthecooks, we're connecting passionate chefs with casual and first-time cooks on every major mobile device, including Android phones, tablets, watches and Google Glass. People use our app to find, rate and comment on dishes they can cook themselves, or post their own ideas complete with directions, ingredients, servings and nutrition information. We have chefs with tens of thousands of followers on Allthecooks, meaning that whenever they make an update, we have to process hundreds of thousands of simultaneous API requests to feed that information to the timelines of all their followers.

Creating a successful platform isn't just about speed; it's about scalability, too. Google Cloud Platform played a key role in helping us grow without worrying about our architecture. We launched in December 2012 with just three part-time engineers and have never taken funding, so building our own infrastructure was out of the question. Since launching, we've grown to over 12 million users, with a million monthly active users. Our application now sees millions of interactions daily that run through Google App Engine and Google Cloud Datastore.

As our user base has grown, we've begun migrating the biggest pieces of our backend processing architecture from App Engine onto Google Compute Engine. This will allow us to operate at even higher performance levels by using cheaper CPU capacity and caching more data in the large-RAM configurations supported by Compute Engine instances. We also plan to use Google BigQuery soon to make the process of finding the perfect recipe even easier.

Cost was a big concern when we were burning through our savings to build Allthecooks, but it wasn’t as important as making everything run as fast as possible. When a request hits, the response needs to be immediate. That’s why we built our recommendation engine on Compute Engine -- that lets us run entirely on RAM, so users get the results they need as soon as they need them. When you’ve got 60 seconds to sear a fish just right, you can’t be caught waiting on a laggy server request to see the next step in a recipe. We never want our app latency to stand between our users and a great meal.

There are plenty of cooking apps out there, but none with the same level of social interaction as Allthecooks, which is one of the biggest reasons our users spend so much time on the app. We let users ask questions and receive answers on every recipe, and upload pictures of their own dishes. Our recommendation engine in particular plays a pivotal role in making the Allthecooks experience so useful on mobile devices. Google Glass users love that they can find and follow recipe instructions hands-free or tilt their heads up to see the ingredient list. They can even record their own recipes using voice, photos and video and send them to their Allthecooks account. All of this requires a reliable infrastructure.

We don’t have a system administrator, so we need a system that’s reliable, manageable and scalable while requiring minimal oversight. Cloud Platform gives us confidence that the app is stable, and it can even be left alone for a week without requiring maintenance. Google’s support team has also given us peace of mind. They’re always quick to respond, and they have provided great services the few times we needed help.

It gives us great pride that Allthecooks has helped millions of people live healthier and discover new foods and products they love. We believe in launching early and listening to what our customers want. Cloud Platform is a crucial component of our success to date and strategic for our future growth.

-Posted by Rafael Sanches, co-founder and engineer of Allthecooks

Posted:
If you saw our post about Cassandra hitting 1 million writes per second on Google Compute Engine, then you know we're getting serious about open source NoSQL. We're making it easier to run the software you love at the scale you need with the reliability of Google Cloud Platform. With over a dozen different virtual machine types and the great price-for-performance of persistent disks, we think Google Compute Engine is a fantastic place for Apache Cassandra.

Today, we're making it even easier to launch a dedicated Apache Cassandra cluster on Google Compute Engine. All it takes is one click after you provide some basic information, such as the size of the cluster. In a matter of minutes, you get a complete Cassandra cluster deployed and configured.

Each node is automatically configured for the cloud, including:
  • The GoogleCloudSnitch for Google Cloud Platform awareness
  • Writes tuned for Google Persistent Disk
  • A JVM tuned to perform on Google Compute Engine instances

The complete set of tuning parameters can be found on the Click to Deploy help page.
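Once the cluster is up, any Compute Engine instance on the same network can talk to it. A minimal sketch using the DataStax Python driver (pip install cassandra-driver); the node IP below is illustrative:

from cassandra.cluster import Cluster

cluster = Cluster(['10.240.0.2'])  # internal IP of any node in the cluster
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3}
""")
rows = list(session.execute('SELECT release_version FROM system.local'))
print('Connected to Cassandra', rows[0].release_version)
cluster.shutdown()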

So, get out and click to deploy your Cassandra cluster today!

Learn more about running Apache Cassandra on Google Compute Engine at https://developers.google.com/cloud/cassandra.

-Posted by Brian Lynch, Solutions Architect

Cassandra is a registered trademark of the Apache Software Foundation. All other trademarks cited here are the property of their respective owners.

Posted:
We've had a great time giving you our predictions for the World Cup (check out our posts before the quarter-finals and semi-finals). So far, we've gotten 13 of 14 games correct. But this isn't about us picking winners in World Cup soccer -- it's about what you can do with Google Cloud Platform. Now, we're open-sourcing our prediction model and packaging it up so you can do your own analysis and predictions.

We used Google Cloud Dataflow to ingest raw, touch-by-touch gameplay data from Opta for thousands of soccer matches. The data goes back to the 2006 World Cup and includes three years of the English Barclays Premier League, two seasons of Spanish La Liga, and two seasons of the U.S. MLS. We then polished the raw data into predictive statistics using Google BigQuery.

You can see BigQuery engineer Jordan Tigani (+JordanTigani) and developer advocate Felipe Hoffa (@felipehoffa) talk about how we did it in this video from Google I/O.

Our prediction for the final
It's a narrow call, but Germany has the edge: our model gives them a 55% chance of defeating Argentina based on several factors. Thus far in the tournament, they've had better passing in the attacking half of the field, more shots (64 vs. 61) and more goals scored (17 vs. 8).

But, 55% is only a small edge. And, although we've been trumpeting our 13 of 14 record, picking winners isn't exactly the same as predicting outcomes. If you'd asked us which scenario was more likely, a 7 to 1 win for Germany against Brazil or a 0 to 1 defeat of Germany by Brazil, we wouldn't have gotten that one quite right.

(Oh, and we think Brazil has a tiny advantage in the third place game. They may have had a disappointing defeat on Tuesday, but the numbers still look good.)

But don’t take our word for it...
Now it’s your turn to take a stab at predicting. We have provided an IPython notebook that shows exactly how we built our model and used it to predict matches. We had to aggregate the data that we used, so you can't compute additional statistics from the raw data. However, for the real data geeks, you could try to see how well neural networks can predict the same data or try advanced techniques like principal components analysis. Alternatively, you can try adding your own features like player salaries or team travel distance. We've only scratched the surface, and there are lots of other approaches you can take.

You might also try simulating how the USA would have done if they had beat Belgium. Or how Germany in 2014 would fare against the unstoppable Spanish team of 2010. Or you could figure out whether the USA team is getting better by simulating the 2006 team against the 2010 and 2014 teams.

Here’s how you can do it
We’ve put everything on GitHub. You’ll find the IPython notebook containing all of the code (using pandas and statsmodels) to build the same machine learning models that we've used to predict the games so far. We've packaged it all up in a Docker container so that you can run your own Google Compute Engine instance to crunch the data. For the most up-to-date step-by-step instructions, check out the readme on GitHub.
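If you just want a feel for the approach before opening the notebook, the core of it is ordinary logistic regression on per-match features. Here is a minimal sketch with pandas and statsmodels (the features.csv file, column names and the example feature row are invented for illustration; the real feature engineering lives in the notebook on GitHub):

import pandas as pd
import statsmodels.api as sm

# One row per team per match: aggregate stats plus whether that team won.
df = pd.read_csv('features.csv')
feature_cols = ['pass_accuracy_att_half', 'shots', 'goals_scored']

X = sm.add_constant(df[feature_cols])
y = df['won']  # 1 if the team won the match, 0 otherwise

model = sm.Logit(y, X).fit()
print(model.summary())

# Predict a hypothetical match from one team's tournament aggregates.
matchup = pd.DataFrame([[1.0, 0.74, 64, 17]], columns=['const'] + feature_cols)
print('Estimated win probability:', float(model.predict(matchup)[0]))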

-Posted by Benjamin Bechtolsheim, Product Marketing Manager