Technology

25 Biggest Moments in Search, From Helpful Images to AI


Here’s how we’ve made Search more helpful over 25 years — and had a little fun along the way, too.

When Google first launched 25 years ago, it was far from the first search engine. But quickly, Google Search became known for our ability to help connect people to the exact information they were looking for, faster than they ever thought possible.

Over the years, we’ve continued to innovate and make Google Search better every day. From creating entirely new ways to search, to helping millions of businesses connect with customers through search listings and ads (starting with a local lobster business advertising via AdWords in 2001), to having some fun with Doodles and easter eggs — it’s been quite a journey.

For our 25th birthday, we’re looking back at some of the milestones that made Google more helpful in the moments that matter, and played a big role in where Google is today. Learn more about our history in our Search Through Time site.

2001: Google Images

When Jennifer Lopez attended the 2000 Grammy Awards, her daring Versace dress became an instant fashion legend — and the most popular query on Google at the time. Back then, search results were just a list of blue links, so people couldn’t easily find the picture they were looking for. This inspired us to create Google Images.

2001: “Did you mean?”

“Did you mean,” with suggested spelling corrections, was one of our first applications of machine learning. Previously, if your search had a misspelling (like “floorescent”), we’d help you find other pages that had the same misspelling, which aren’t usually the best pages on the topic. Over the years we’ve developed new AI-powered techniques to ensure that even if your finger slips on the keyboard, you can find what you need.
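As a rough illustration of the idea behind spelling suggestions, here is a minimal Python sketch that matches a misspelled query against a small vocabulary using similarity matching from the standard library's difflib. The vocabulary and cutoff are invented for the example; Google's actual system is far more sophisticated and AI-driven.

```python
from difflib import get_close_matches

# A tiny dictionary of known-good query terms. A production system would
# learn these from vast query logs; this list is purely illustrative.
VOCABULARY = ["fluorescent", "florescence", "flowers", "florist"]

def did_you_mean(query, vocabulary=VOCABULARY):
    """Return the closest vocabulary word to a possibly misspelled query,
    or None if nothing is similar enough."""
    matches = get_close_matches(query.lower(), vocabulary, n=1, cutoff=0.6)
    return matches[0] if matches else None

print(did_you_mean("floorescent"))  # -> fluorescent
```

The cutoff keeps wildly dissimilar queries from producing a suggestion at all, which mirrors the product behavior: "Did you mean" only appears when there is a plausible correction.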

2002: Google News

During the tragic events of September 11, 2001, people struggled to find timely information in Search. To meet the need for real-time news, we launched Google News the following year with links to a diverse set of sources for any given story.

2003: Easter eggs

Googlers have developed many clever Easter eggs hidden in Search over the years. In 2003, one of our first Easter eggs gave the answer to life, the universe and everything, and since then millions of people have turned their pages askew, done a barrel roll, enjoyed a funny recursive loop and celebrated moments in pop culture.

One of our earliest Easter eggs is still available on Search.

2004: Autocomplete

Wouldn’t it be nice to type as quickly as you think? Cue Autocomplete: a feature first launched as “Google Suggest” that automatically predicts queries in the search bar as you start typing. Today, Autocomplete reduces typing by 25% on average and saves an estimated 200-plus years of typing time per day.
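The core prediction step, finding all known queries that share a typed prefix, can be sketched with a prefix tree. This is an illustrative toy (the sample queries are invented, and real autocomplete also ranks predictions by popularity, freshness and other signals):

```python
from collections import defaultdict

class Trie:
    """A minimal prefix tree for query completion."""
    def __init__(self):
        self.children = defaultdict(Trie)
        self.is_query = False

    def insert(self, query):
        node = self
        for ch in query:
            node = node.children[ch]
        node.is_query = True

    def complete(self, prefix):
        """Return all stored queries starting with `prefix`, alphabetically."""
        node = self
        for ch in prefix:
            if ch not in node.children:
                return []
            node = node.children[ch]
        results = []
        def walk(n, suffix):
            if n.is_query:
                results.append(prefix + suffix)
            for ch, child in sorted(n.children.items()):
                walk(child, suffix + ch)
        walk(node, "")
        return results

trie = Trie()
for q in ["weather today", "weather tomorrow", "web search"]:
    trie.insert(q)
print(trie.complete("wea"))  # -> ['weather today', 'weather tomorrow']
```

A trie makes prefix lookup proportional to the length of what you have typed rather than the number of stored queries, which is why it is a classic building block for type-ahead features.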

2004: Local information

People used to rely on traditional phone books for business information. The web paved the way for local discovery, like “pizza in Chicago” or “haircut 75001.” In 2004, Google Local added relevant information to business listings like maps, directions and reviews. In 2011, we added click to call on mobile, making it easy to get in touch with businesses while you’re on the go. On average, local results in Search drive more than 6.5 billion connections for businesses every month, including phone calls, directions, ordering food and making reservations.

2006: Google Translate

Google researchers started developing machine translation technology in 2002 to tackle language barriers online. Four years later, we launched Google Translate with text translations between Arabic and English. Today, Google Translate supports more than 100 languages, with 24 added last year.

2006: Google Trends

Google Trends was built to help us understand trends on Search with aggregated data (and create our annual Year in Search). Today, Google Trends is the world’s largest free dataset of its kind, enabling journalists, researchers, scholars and brands to learn how searches change over time.

2007: Universal Search

Helpful search results should include relevant information across formats, like links, images, videos, and local results. So we redesigned our systems to search all of the content types at once, decide when and where results should blend in, and deliver results in a clear and intuitive way. The result, Universal Search, was our most radical change to Search at the time.

2008: Google Mobile App

With the arrival of Apple’s App Store, we launched our first Google Mobile App on iPhone. Features like Autocomplete and “My Location” made search easier with fewer key presses, and were especially helpful on smaller screens. Today, there’s so much you can do with the Google app — available on both Android and iOS — from getting help with your math homework with Lens to accessing visual translation tools in just a tap.

2008: Voice Search

In 2008, we introduced the ability to search by voice on the Google Mobile App, expanding to desktop in 2011. With Voice Search, people can search by voice with the touch of a button. Today, search by voice is particularly popular in India, where the percentage of Indians doing daily voice queries is nearly twice the global average.

2009: Emergency Hotlines

Following a suggestion from a mother who had a hard time finding poison control information after her daughter swallowed something potentially dangerous, we created a box for the poison control hotline at the top of the search results page. Since this launch, we’ve elevated emergency hotlines for other critical moments of need, like suicide prevention.

2011: Search by Image

Sometimes, what you’re searching for can be hard to describe with words. So we launched Search by Image so you can upload any picture or image URL, find out what it is and where else that image is on the web. This update paved the way for Lens later on.

2012: Knowledge Graph

We introduced the Knowledge Graph, a vast collection of people, places and things in the world and how they’re related to one another, to make it easier to get quick answers. Knowledge Panels, the first feature powered by the Knowledge Graph, give you a quick snapshot of information about topics like celebrities, cities and sports teams.

2015: Popular Times

We launched the Popular Times feature in Search and Maps to help people see the busiest times of day when they search for places like restaurants, stores and museums.

2016: Discover

By launching a personalized feed (now called Discover) we helped people explore content tailored to their interests right in the Google app, without having to search.

2017: Lens

Google Lens turns your camera into a search tool: it looks at the objects in a picture, compares them to other images, and ranks those images based on their similarity and relevance to the original picture. Now, you can search what you see in the Google app. Today, Lens handles more than 12 billion visual searches per month.
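The compare-and-rank step described above can be sketched as similarity scoring over feature vectors. The vectors and filenames below are hypothetical stand-ins for the features a vision model would extract from each image; the real Lens pipeline is, of course, vastly more complex:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Hypothetical embeddings standing in for extracted image features.
query_image = [0.9, 0.1, 0.3]
candidates = {
    "coffee_mug.jpg": [0.8, 0.2, 0.4],
    "bicycle.jpg": [0.1, 0.9, 0.2],
}

# Rank candidate images by similarity to the query picture.
ranked = sorted(candidates,
                key=lambda name: cosine_similarity(query_image, candidates[name]),
                reverse=True)
print(ranked)  # the mug, being most similar, ranks first
```

The key idea is that "what is in this picture?" becomes a nearest-neighbor search in a feature space, so ranking visual matches reduces to sorting by a similarity score.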

2018: Flood forecasting

To help people better prepare for impending floods, we created AI forecasting models that predict when and where devastating floods will occur. We started these efforts in India, and today we’ve expanded flood warnings to 80 countries.

2019: BERT

A big part of what makes Search helpful is our ability to understand language. In 2018, we introduced and open-sourced a neural network-based technique for training our language understanding models: BERT (Bidirectional Encoder Representations from Transformers). BERT makes Search more helpful because it considers the full context of a word, looking at the words that come before and after it. After rigorous testing in 2019, we applied BERT to more than 70 languages. Learn more about how BERT works to understand your searches.

2020: Shopping Graph

Online shopping became a whole lot easier and more comprehensive when we made it free for any retailer or brand to show their products on Google. We also introduced the Shopping Graph, an AI-powered, constantly updating dataset of products, sellers, brands, reviews and local inventory that today consists of 35 billion product listings.

2020: Hum to Search

We launched Hum to Search in the Google app, so you’ll no longer be frustrated when you can’t remember the tune that’s stuck in your head. The machine learning feature identifies potential song matches after you hum, whistle or sing a melody. You can then explore information on the song and artist.

2021: About this result

To help people make more informed decisions about which results will be most useful and reliable for them, we added “About this result” next to most search results. It explains why a result is being shown to you and gives more context about the content and its source, based on best practices from information literacy experts. “About this result” is now available in all languages where Search is available.

2022: Multisearch

To help you uncover the information you’re looking for — no matter how tricky — we created an entirely new way to search with text and images simultaneously through Multisearch. Now you can snap a photo of your dining set and add the query “coffee table” to find a matching table. First launched in the U.S., Multisearch is now available globally on mobile, in all languages and countries where Lens is available.

2023: Search Labs & Search Generative Experience (SGE)

Every year in Search, we do hundreds of thousands of experiments to figure out how to make Google more helpful for you. With Search Labs, you can test early-stage experiments and share feedback directly with the teams working on them. The first experiment, SGE, brings the power of generative AI directly into Search. You can get the gist of a topic with AI-powered overviews, pointers to explore more and natural ways to ask follow-up questions. Since launching in the U.S., we’ve rapidly added new capabilities, with more to come.

As someone who’s been following the world of search engines for more than two decades, it’s amazing to reflect on where Google started — and how far we’ve come.


Zoho Unveils New AI Assistant for Zoho Creator


By Aduragbemi Omiyale

To facilitate faster, simpler, and more intelligent app building, Zoho Corporation has launched new services and features within its low-code application development platform, Zoho Creator.

The new Artificial Intelligence (AI) assistant, CoCreator, can be used to build applications by using voice and written prompts, process flows and business specification documents.

In a statement to Business Post on Monday, the global technology company said this milestone reflects its commitment to investing in AI capabilities that offer real-time, practical, and secure advantages to business users.

Powered by Zia, CoCreator drives shorter go-to-market timeframes and democratises app creation for users of varying skill levels—all without requiring add-ons to a customer’s existing subscription.

Zia has served as a bridge across Zoho’s entire product suite, including Creator, since its launch in 2015.

As AI becomes increasingly central to business operations, Zoho’s complete ownership of its tech stack and deep AI integration provides customers with a higher level of contextual AI across all company workflows compared to competitors. This empowers users with a system that truly understands their data and anticipates its usage.

Among the newly-launched capabilities is the Idea-to-App Generation feature, allowing businesses to utilise ZohoAI or OpenAI to develop full-fledged applications, including contextual integrations, automations, permission sets and insightful dashboards.

By using text or voice prompts, process flow diagrams, or system documentation like software requirement specifications (SRS), Creator will provide domain-specific suggestions, ideas for relevant fields, and modules tailored to a customer’s business.

With contextual component generation, AI enhances existing applications by offering prompt-based form generation. Zia also proactively suggests contextual fields within forms, a functionality missing from many low-code development platforms.

Developers of all skill levels can generate and optimise code blocks contextually within apps using Zia’s prompter, and also annotate existing code blocks for future maintenance.

Further advancing business capabilities, users can rapidly transform unstructured data from various file types and databases into custom applications and remove inconsistencies using the AI-driven data cleansing and modelling feature.

Additionally, the newly-introduced AI Skills feature enables businesses to build apps with specialised skills that interpret natural language instructions in the business context and intelligently automate complex chains of actions. This feature is currently in early access and will be widely available from June 2025.

“Since we introduced Creator in 2006, our mission has been to make app development simpler and faster, without compromising on functionality.

“AI now takes us to the next level, shortening the time from an idea to an app.

“Today’s announcement significantly raises the baseline on speed of quality app creation with deep capabilities, without adding costs,” the Country Head for Zoho Nigeria, Mr Kehinde Ogundare, stated.


The Unsung Heroes of Fintech: How Creatives Are Driving Growth and Trust in the Financial Industry


By Samuel Olaniran

Many experts have highlighted the growing impact of creatives—especially those in product and brand design—across the financial industry, and how their work helps financial companies build trust, communicate value propositions, and drive growth.

These creatives shape the overall product and visual identity of financial brands, creating not just logos, colour schemes, and layouts, but also cohesive design systems that convey professionalism and reliability. This is crucial because trust is vital in finance. A strong, consistent brand and product design helps customers feel secure and confident in their financial decisions.

In digital platforms, product designers improve user experience. They ensure mobile apps, websites, and other tools are not only visually appealing but also functional and easy to navigate. A smooth, intuitive interface encourages users to engage more, making digital banking and investing more accessible to a wider audience. This can drive growth, as people are more likely to trust and stick with platforms that are easy to use.

Brand and product designers also simplify complex financial data through infographics and visualizations. Finance can be overwhelming, but clear visuals and product-led storytelling make it easier for customers to understand. Infographics turn complicated reports into digestible, engaging content, which can help customers make better financial decisions.

Marketing in finance also relies heavily on thoughtful brand design. Designers create visually appealing campaigns that catch the attention of potential customers. Whether it’s an ad on social media or an email newsletter, well-crafted design helps companies stand out and build a strong online presence.

In a competitive industry like fintech, where innovation is key, product and brand design can be the difference between success and failure.

As financial institutions grow globally, product designers help adapt their offerings and messaging to different cultures. By adjusting colours, symbols, and user interface elements to fit local preferences, they ensure financial products are accessible to a wider audience. This helps companies expand into new markets while keeping their brand relevant and consistent.

Looking ahead, the role of product and brand designers will only become more important. Their creative work is key to building trust, improving user experience, simplifying data, and leading marketing efforts. As finance continues to evolve, their role will remain essential in helping companies grow and connect with customers.


Tribunal Orders Meta, WhatsApp to Pay FCCPC’s $220m Fine in 60 Days


By Adedapo Adesanya

Nigeria’s Competition and Consumer Protection Tribunal on Friday ordered WhatsApp and Meta Platforms Incorporated to pay, within 60 days, a $220 million penalty over data discrimination practices in Nigeria, plus $35,000 as reimbursement for the Federal Competition and Consumer Protection Commission’s (FCCPC) investigation against the social media giant.

The tribunal thereby upheld the penalty imposed by the FCCPC and dismissed the appeal by WhatsApp and Meta Platforms Incorporated against it.

The tribunal’s three-member panel, led by Mr Thomas Okosun, passed the verdict on Friday.

WhatsApp and Meta’s legal team, led by Mr Gbolahan Elias (SAN), and the FCCPC’s legal team, represented by Mr Babatunde Irukera (SAN), a former Executive Vice Chairman of the agency, made their final arguments on behalf of their respective clients on January 28, 2025.

Last year, the FCCPC asked Meta, the parent company of WhatsApp, Facebook, and Instagram, to pay $220 million for an alleged data privacy breach.

According to the agency, Meta was found culpable of denying Nigerians the right to self-determination, unauthorised transfer and sharing of Nigerians’ data, discrimination and disparate treatment, abuse of dominance, and tying and bundling.

The FCCPC noted that its decision was reached after a 38-month joint investigation with the Nigeria Data Protection Commission (NDPC).

The regulator also noted that its actions were based on legitimate consumer protection and data privacy concerns. It highlighted that its final order requires Meta to respect the rights of Nigerian consumers and meet local standards.

“Similar measures are taken in other jurisdictions without forcing companies to leave the market. The case of Nigeria will not be different,” the FCCPC added.

Also weighing in on the issue at the time, Mr Irukera noted on X that the platform’s approach in Nigeria differed from the one it takes in other places where it operates.

“The same company just settled a Texas case for $1.4 billion and is currently facing regulatory action in at least a dozen nations, appealing large penalties in several countries. How many has it threatened to exit?” he queried.
