Around 2002 I attended a private party for Google — before its IPO, when it was a small company focused only on search. I struck up a conversation with Larry Page, Google’s brilliant cofounder. “Larry, I still don’t get it. There are so many search companies. Web search, for free? Where does that get you?” My unimaginative blindness is solid evidence that predicting is hard, especially about the future, but in my defense this was before Google had ramped up its ad auction scheme to generate real income, long before YouTube or any other major acquisitions. I was not the only avid user of its search site who thought it would not last long. But Page’s reply has always stuck with me: “Oh, we’re really making an A.I.”
I’ve thought a lot about that conversation over the past few years as Google has bought 13 other AI and robotics companies in addition to DeepMind. At first glance, you might think that Google is beefing up its AI portfolio to improve its search capabilities, since search constitutes 80 percent of its revenue. But I think that’s backward. Rather than use AI to make its search better, Google is using search to make its AI better. Every time you type a query, click on a search-generated link, or create a link on the web, you are training the Google AI. When you type “Easter Bunny” into the image search bar and then click on the most Easter Bunny-looking image, you are teaching the AI what an Easter Bunny looks like. Each of the 3 billion queries that Google conducts each day tutors the deep-learning AI over and over again. With another 10 years of steady improvements to its AI algorithms, plus a thousandfold more data and a hundred times more computing resources, Google will have an unrivaled AI. In a quarterly earnings conference call in the fall of 2015, Google CEO Sundar Pichai stated that AI was going to be “a core transformative way by which we are rethinking everything we are doing… We are applying it to all of our products, be it search, be it YouTube and Play etc.” My prediction: By 2026, Google’s main product will not be search but AI.
All new digital services from the government must meet the Digital by Default Service Standard.
All public-facing transactional services must meet the standard. It’s used by departments and the Government Digital Service to check whether a service is good enough for public use.
1. Understand user needs
Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for the design of the service.
2. Do ongoing user research
Put a plan in place for ongoing user research and usability testing to continuously seek feedback from users to improve the service.
3. Have a multidisciplinary team
Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.
4. Use agile methods
Build your service using the agile, iterative and user-centred methods set out in the manual.
5. Iterate and improve frequently
Build a service that can be iterated and improved on a frequent basis and make sure that you have the capacity, resources and technical flexibility to do so.
6. Evaluate tools and systems
Evaluate what tools and systems will be used to build, host, operate and measure the service, and how to procure them.
7. Understand security and privacy issues
Evaluate what user data and information the digital service will be providing or storing and address the security level, legal responsibilities, privacy issues and risks associated with the service (consulting with experts where appropriate).
8. Make all new source code open
Make all new source code open and reusable, and publish it under appropriate licences (or provide a convincing explanation as to why this can’t be done for specific subsets of the source code).
9. Use open standards and common platforms
Use open standards and common government platforms where available.
10. Test the end-to-end service
Be able to test the end-to-end service in an environment identical to that of the live version, including on all common browsers and devices, and using dummy accounts and a representative sample of users.
11. Make a plan for being offline
Make a plan for the event of the digital service being taken temporarily offline.
12. Create a service that’s simple
Create a service that is simple and intuitive enough that users succeed first time.
13. Make the user experience consistent with GOV.UK
Build a service consistent with the user experience of the rest of GOV.UK including using the design patterns and style guide.
14. Encourage everyone to use the digital service
Encourage all users to use the digital service (with assisted digital support if required) alongside an appropriate plan to phase out non-digital channels and services.
15. Collect performance data
Use tools for analysis that collect performance data. Use this data to analyse the success of the service and to translate this into features and tasks for the next phase of development.
16. Identify performance indicators
Identify performance indicators for the service, including the 4 mandatory key performance indicators (KPIs) defined in the manual. Establish a benchmark for each metric and make a plan to enable improvements.
17. Report performance data on the Performance Platform
Report performance data on the Performance Platform so the service can be measured and assessed against its performance indicators.
18. Test with the minister
Test the service from beginning to end with the minister responsible for it.