The Role of Cloud Computing in Digital Transformation


Digital transformation is the process of using digital technologies to transform traditional business operations and processes. Cloud computing plays a crucial role in digital transformation by enabling businesses to adopt new technologies, innovate faster, and increase their agility. In this article, we will explore the role of cloud computing in digital transformation.

Enabling the Adoption of New Technologies

Cloud computing provides a platform for businesses to adopt new technologies without the need for significant capital expenditure. By leveraging cloud services, businesses can quickly and easily access the latest software, applications, and infrastructure, allowing them to innovate and stay competitive.

Improving Business Agility

Cloud computing enables businesses to become more agile by providing flexible and scalable infrastructure. This allows businesses to rapidly respond to changes in the market, customer needs, and emerging technologies. Cloud computing also allows businesses to test and deploy new applications and services quickly, reducing time-to-market and increasing business agility.

Enhancing Collaboration

Cloud computing enables remote collaboration and improves communication between team members, regardless of their location. This allows businesses to collaborate and share data and insights in real-time, improving decision-making and accelerating innovation.

Reducing Costs

Cloud computing can help businesses reduce costs by eliminating the need for on-premises infrastructure and reducing IT staffing requirements. By leveraging cloud services, businesses can reduce capital expenditure and operational costs while still benefiting from the latest technologies and infrastructure.

Enhancing Security

Cloud computing provides robust security features and protocols, enabling businesses to protect their data and applications from cyber threats. Cloud providers often have teams of security experts dedicated to ensuring the safety and security of their clients’ data.

In conclusion, cloud computing plays a critical role in digital transformation by enabling businesses to adopt new technologies, increase agility, enhance collaboration, reduce costs, and improve security. As businesses continue to navigate the rapidly changing digital landscape, cloud computing will remain a key enabler of digital transformation, helping businesses to innovate and stay competitive.

The Economics of Cloud Computing: How it Can Help You Save Money


Cloud computing has revolutionized the way businesses operate, offering cost-effective solutions that can help them save money. In this article, we will explore the economics of cloud computing and how it can help businesses save money.

Reduced Capital Expenditure

One of the primary benefits of cloud computing is the reduction in capital expenditure (CapEx). With traditional IT infrastructure, businesses need to purchase and maintain hardware, software, and networking equipment, which can be expensive. Cloud computing eliminates the need for businesses to invest in their own hardware and software, allowing them to reduce their CapEx.

Lower Operational Costs

In addition to reducing CapEx, cloud computing also helps businesses lower their operational costs. With cloud computing, businesses only pay for the services they use, allowing them to reduce operational expenses. For example, instead of purchasing and maintaining their own servers, businesses can use cloud services to store their data, which can be more cost-effective.


Scalability

Cloud computing allows businesses to scale their operations up or down based on their needs. This means businesses can avoid the costs associated with over-provisioning or under-provisioning resources. With cloud computing, businesses can quickly add or remove resources as needed, allowing them to save money and improve efficiency.

Reduced IT Staffing Costs

Managing an on-premises IT infrastructure requires skilled IT staff, which can be expensive. Cloud computing eliminates the need for businesses to hire their own IT staff, as cloud providers handle the maintenance and management of the infrastructure. This can result in significant cost savings for businesses.

Increased Productivity

Cloud computing also allows businesses to increase their productivity, which can result in cost savings. With cloud services, employees can access data and applications from anywhere, at any time, using any device with an internet connection. This allows employees to work remotely and collaborate easily, resulting in increased productivity and reduced downtime.

In conclusion, cloud computing offers several economic benefits that can help businesses save money. By reducing CapEx, lowering operational costs, providing scalability, reducing IT staffing costs, and increasing productivity, cloud computing can help businesses optimize their operations, improve efficiency, and gain a competitive advantage. If you haven’t already considered cloud computing for your business, now is the time to do so.

Cloud Computing vs. Traditional IT Infrastructure: Which is Right for Your Business?


When it comes to managing an IT infrastructure, businesses have two options: cloud computing or traditional on-premises infrastructure. While both have their advantages and disadvantages, choosing the right solution for your business depends on your specific needs and goals. In this article, we will compare cloud computing and traditional IT infrastructure to help you make an informed decision.

Cloud Computing

Cloud computing refers to the delivery of computing services, including servers, storage, databases, software, and analytics, over the internet. Instead of maintaining an on-premises infrastructure, businesses can use cloud services to access resources from anywhere, at any time, using any device with an internet connection. Cloud computing offers several advantages, including:

Cost Savings: Cloud computing eliminates the need for businesses to purchase and maintain hardware and software, reducing upfront costs.

Scalability: Cloud services can be scaled up or down based on business needs, allowing businesses to pay only for what they use.

Flexibility: Cloud services can be accessed from anywhere, allowing employees to work remotely and collaborate easily.

Security: Cloud providers typically invest in advanced security measures to protect their customers’ data.

Traditional IT Infrastructure

Traditional IT infrastructure involves setting up servers, storage devices, and networking equipment on-premises, which can be expensive and time-consuming to manage. However, it offers some benefits, including:

Control: With on-premises infrastructure, businesses have complete control over their data and infrastructure, which may be important for regulatory compliance or security reasons.

Customization: On-premises infrastructure can be customized to meet specific business needs.

Performance: With on-premises infrastructure, businesses have direct access to their resources, which can result in faster performance and reduced latency.

Integration: On-premises infrastructure can be easily integrated with other systems and applications, making it easier to manage workflows and processes.

Which is Right for Your Business?

Choosing the right infrastructure depends on your business needs and goals. If you have limited resources, want to reduce upfront costs, and need flexibility, cloud computing may be the best option for you. On the other hand, if you need complete control over your data, have specialized requirements, and need faster performance, on-premises infrastructure may be the better choice.

It’s important to note that many businesses choose a hybrid approach, combining cloud and on-premises infrastructure to take advantage of the benefits of both. This allows businesses to keep sensitive data on-premises while utilizing cloud services for scalability, flexibility, and cost savings.

In conclusion, both cloud computing and traditional IT infrastructure have their advantages and disadvantages. To make the right decision for your business, it’s important to consider your specific needs and goals, including budget, scalability, control, performance, and security. By choosing the right infrastructure, businesses can optimize their operations, improve efficiency, and gain a competitive advantage.

Vertex AI Streaming Ingestion allows for real-time AI



Near-real-time predictions are required for many machine learning (ML) use cases, such as fraud detection, ad targeting, and recommendation engines. These predictions depend on access to the latest data, where even a few seconds of delay can make a big difference in performance. It’s not easy to build the infrastructure needed to support both high-throughput updates and low-latency retrieval.

Vertex AI Matching Engine and Feature Store now support real-time Streaming Ingestion. Matching Engine is a fully managed vector database for vector similarity search; with Streaming Ingestion, items in an index are updated continuously and reflected in similarity search results immediately. Feature Store Streaming Ingestion lets you retrieve the most recent feature values and extract real-time data for training.
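To make the core idea concrete, here is a minimal, self-contained sketch of a streaming vector index: a toy in-memory stand-in, not the Matching Engine API. Matching Engine does this at scale with approximate nearest-neighbor search; the sketch brute-forces cosine similarity, but shows the key property, that an upsert is visible to the very next query.

```python
import math

class TinyVectorIndex:
    """Toy in-memory stand-in for a streaming vector index."""

    def __init__(self):
        self.vectors = {}

    def upsert(self, item_id, vector):
        # Insert a new item or update an existing one; the change is
        # visible to the next query immediately (streaming ingestion).
        self.vectors[item_id] = vector

    def remove(self, item_id):
        self.vectors.pop(item_id, None)

    def query(self, vector, k=3):
        # Brute-force cosine similarity over all stored vectors.
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(x * x for x in b))
            return dot / (na * nb)

        scored = sorted(self.vectors.items(),
                        key=lambda kv: cos(vector, kv[1]),
                        reverse=True)
        return [item_id for item_id, _ in scored[:k]]

index = TinyVectorIndex()
index.upsert("shoes", [1.0, 0.0])
index.upsert("boots", [0.9, 0.1])
index.upsert("piano", [0.0, 1.0])
print(index.query([1.0, 0.05], k=2))  # the two footwear items rank first
```

A batch-indexed system would only reflect the `upsert` calls after the next rebuild; here, as with Streaming Ingestion, they take effect per call.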

Digits has taken advantage of Vertex AI Matching Engine Streaming Ingestion to power its product, Boost, a tool that saves accountants time and automates manual quality control. Thanks to Matching Engine Streaming Ingestion, Digits Boost can now provide analysis and features in real time; previously, transactions were classified on a 24-hour batch schedule.

“With Matching Engine Streaming Ingestion, we are able to perform near-real-time incremental indexing activities such as inserting, updating, or deleting embeddings on existing indexes. This has helped us speed up our process. Now we can provide immediate feedback to our customers and handle more transactions more quickly,” said Hannes Hapke, machine learning engineer at Digits.

This blog post explains how these new features improve predictions and enable near-real-time use cases such as recommendations, content personalization, and cybersecurity monitoring.

Streaming Ingestion enables real-time AI

Organizations are realizing the potential business benefits of predictive models that use up-to-date information, and more AI applications are being developed. Here are some examples.

Real-time recommendations for a marketplace: Mercari has added Streaming Ingestion to its existing Matching Engine product recommendations. This creates a real-time marketplace where users can search for products based on their interests and see results updated immediately as new products are added. It feels like shopping at a farmer’s market in the morning: fresh food is brought in while you shop.

Mercari can combine Matching Engine’s filtering capabilities with Streaming Ingestion to determine whether an item should appear in search results, based on tags such as “online/offline” and “instock/nostock”.

Large-scale personalized streaming: You can create pub/sub channels for any stream of content that is represented with feature vectors. This allows you to select the most valuable content according to each subscriber’s interests.

Matching Engine’s scalability (it can process millions of queries per second) means you can support millions of online subscribers to content streams. Because it is highly scalable, you can also serve a wide range of dynamically changing topics. Matching Engine’s filtering capabilities let you control what content is included by assigning tags such as “explicit” or “spam” to each object.

Feature Store can be used as a central repository to store and serve your content’s feature vectors in near real time.

Monitoring: This content-streaming pattern can also be used to monitor events and signals from IT infrastructure, IoT devices, or manufacturing production lines. For example, you can extract signals from millions of sensors and devices and turn them into feature vectors.

With Matching Engine, you can update lists such as “top 100 devices with defective signals” or “top 100 sensor events with outliers” in near real time.
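The “top 100” pattern above reduces to keeping a ranked list over a stream of scored events. A minimal sketch, using hypothetical outlier scores produced upstream (this is the pattern, not a Matching Engine API):

```python
import heapq

def top_k_outliers(events, k=3):
    """Keep the k events with the highest outlier score.

    `events` is an iterable of (event_id, score) pairs, e.g. scores
    produced upstream by an anomaly model. heapq.nlargest keeps the
    running top-k without sorting the full stream.
    """
    return heapq.nlargest(k, events, key=lambda e: e[1])

stream = [("sensor-a", 0.2), ("sensor-b", 0.9),
          ("sensor-c", 0.5), ("sensor-d", 0.95)]
print(top_k_outliers(stream, k=2))  # sensor-d and sensor-b lead
```

At Matching Engine scale the ranking is done by vector similarity against an outlier signature, but the consumer-side shape of the result, a continuously refreshed top-k list, is the same.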

Spam detection: If you are looking for security threat signatures and spam activity patterns, Matching Engine can instantly identify potential attacks across millions of monitoring points. Security threat identification that relies on batch processing can introduce significant delays, leaving the company more vulnerable; with real-time data, your models can detect threats and spam more quickly.

Implementing streaming use cases

Let’s take a closer look at some of these use cases.

Real-time recommendations for retail
Mercari created a feature extraction pipeline using Streaming Ingestion:

Invoke the pipeline: The feature extraction pipeline is built with Vertex AI Pipelines and is invoked periodically by Cloud Scheduler or Cloud Functions.

Get item information: The pipeline issues a query to retrieve the latest item data from BigQuery.

Extract feature vectors: The pipeline runs predictions on the data using a word2vec model to extract feature vectors.

Update the index: The pipeline calls the Matching Engine API to add the feature vectors to the vector index. The vectors are also saved to Bigtable (which may be replaced by Feature Store in the future).
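Stripped of the managed services, the pipeline reduces to three steps: fetch, embed, upsert. A hypothetical sketch with stubbed-out data sources follows; the stand-in functions replace BigQuery, the word2vec model, and the Matching Engine upsert, and the trivial bag-of-characters embedding is purely illustrative:

```python
def get_latest_items():
    # Stand-in for the BigQuery query that fetches the latest item data.
    return [{"id": "item-1", "title": "red sneaker"},
            {"id": "item-2", "title": "blue sneaker"}]

def extract_feature_vector(item):
    # Stand-in for the word2vec model: a trivial bag-of-characters
    # embedding (26 letter counts), for illustration only.
    vec = [0.0] * 26
    for ch in item["title"].lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def update_index(index, items):
    # Stand-in for the Matching Engine upsert call: each item's vector
    # is written into the index, keyed by item id.
    for item in items:
        index[item["id"]] = extract_feature_vector(item)

index = {}
update_index(index, get_latest_items())
print(sorted(index))  # both items are now indexed
```

In the real pipeline each function is a Vertex AI Pipelines component, and the whole run is triggered periodically by Cloud Scheduler or Cloud Functions.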

“We were pleasantly surprised by the extremely short latency for index updates when we tested Matching Engine Streaming Ingestion. We would like to add this functionality to our production service as soon as it becomes GA,” said Nogami Wakana, software engineer at Souzoh, a Mercari-group company.

This architecture design is also applicable to retail businesses that require real-time product recommendations.

Ad targeting

Real-time features, item matching, and the latest information are key to ad recommender systems. Let’s look at how Vertex AI can help you build an ad targeting system.

First, generate a list of candidates from the ad corpus. This is difficult because you need to generate relevant candidates in milliseconds. Vertex AI Matching Engine can generate relevant candidates with low-latency vector similarity matching, and Streaming Ingestion keeps your index up to date.

The next step is to rerank the candidates with a machine learning model to ensure the ad candidates are in the right order. To ensure the model uses the most recent data, you can use Feature Store Streaming Ingestion to import the latest features, and online serving to fetch feature values at low latency, improving precision.
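The retrieve-then-rerank flow can be sketched in a few lines. This is a toy illustration with hypothetical data: brute-force dot products stand in for Matching Engine, and a dict of fresh click-through rates stands in for Feature Store online serving.

```python
def retrieve_candidates(query_vec, ads, k=3):
    # Stage 1: cheap similarity match over the ad corpus
    # (Matching Engine's role, brute-forced here with a dot product).
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return sorted(ads, key=lambda ad: dot(query_vec, ad["vec"]),
                  reverse=True)[:k]

def rerank(candidates, fresh_ctr):
    # Stage 2: reorder the short list using fresh feature values
    # (Feature Store's role; fresh_ctr stands in for online serving).
    return sorted(candidates, key=lambda ad: fresh_ctr.get(ad["id"], 0.0),
                  reverse=True)

ads = [{"id": "ad-1", "vec": [1.0, 0.0]},
       {"id": "ad-2", "vec": [0.8, 0.2]},
       {"id": "ad-3", "vec": [0.0, 1.0]}]
cands = retrieve_candidates([1.0, 0.1], ads, k=2)     # ad-1 and ad-2 survive
ranked = rerank(cands, {"ad-1": 0.01, "ad-2": 0.05})  # fresh CTR promotes ad-2
print([ad["id"] for ad in ranked])
```

The design point is the split: a fast, approximate first stage narrows millions of ads to a handful, so the expensive, feature-rich model only scores the short list.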

Final optimizations can be applied after reranking the ad candidates. You can implement the optimization step using a

GKE Networking Basics – Understanding the basics of networking


This article discusses the networking components of Google Kubernetes Engine (GKE) and the options available. Kubernetes is an open-source platform for managing containerized workloads and services; GKE is a fully managed environment for running Kubernetes on Google Cloud infrastructure.

IP address

Kubernetes uses IP addresses and ports for communication between various network components. IP addresses are unique addresses that identify different components of the network.


  • Containers – These components execute application processes. A pod can contain one or more containers.
  • Pods – A group of one or more containers deployed together. Pods are scheduled onto nodes.
  • Nodes – Worker machines within a cluster (a cluster is a collection of nodes). A node runs zero or more pods.


  • ClusterIP – An address assigned to a particular Service.
  • Load balancer – Balances internal and external traffic to cluster nodes.
  • Ingress – A load balancer that handles HTTP(S) traffic.

IP addresses are assigned to components and services from subnets. CIDR blocks are created using variable-length subnet masking (VLSM). The subnet mask determines how many hosts are available on a subnet.

Google Cloud uses 2^n − 4 to calculate the number of available hosts, where n is the number of host bits. This differs from the 2^n − 2 formula typically used for on-premises networks.
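The difference between the two formulas comes from how many addresses each environment reserves per subnet. A small sketch of the calculation:

```python
def usable_hosts(prefix_len, on_prem=False):
    """Usable addresses in an IPv4 subnet of the given prefix length.

    Google Cloud reserves 4 addresses per subnet, so it uses 2**n - 4;
    traditional on-premises networking reserves 2 (network address and
    broadcast), giving the familiar 2**n - 2.
    """
    host_bits = 32 - prefix_len
    reserved = 2 if on_prem else 4
    return 2 ** host_bits - reserved

print(usable_hosts(24))                # 252 usable hosts on Google Cloud
print(usable_hosts(24, on_prem=True))  # 254 usable hosts on-premises
```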

This is how the flow of IP address assignments looks:

  • The cluster’s VPC network assigns IP addresses to nodes.
  • Internal load balancer IP addresses are assigned automatically from the node IPv4 block. Alternatively, you can reserve a range for your load balancers and specify an address with the loadBalancerIP option.
  • The addresses assigned to pods come from a range issued to the pods on that particular node. By default, a node can run up to 110 pods. This number is multiplied by 2 to allocate the address range (110 × 2 = 220), and the nearest subnet that fits is used, /24 by default. This creates a buffer for scheduling pods. The limit can be set at cluster creation.
  • Containers share the IP address of the pod they run in.
  • Service (ClusterIP) addresses come from an address pool reserved for Services.

You can see an example of how to plan and scope address ranges in the IP addresses for VPC native clusters section.
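The pod-range sizing described above (the per-node pod maximum doubled, then rounded up to the nearest subnet) can be sketched as:

```python
import math

def pod_range_prefix(max_pods_per_node):
    """Smallest subnet prefix whose size covers 2x the pod maximum.

    GKE doubles the per-node pod maximum (e.g. 110 * 2 = 220) to leave
    headroom for pod scheduling, then picks the nearest subnet that
    fits that many addresses.
    """
    needed = max_pods_per_node * 2
    host_bits = math.ceil(math.log2(needed))
    return 32 - host_bits

print(pod_range_prefix(110))  # /24: 220 addresses need 8 host bits (256)
```

Lowering the per-node pod maximum at cluster creation therefore shrinks the per-node pod range, which matters when planning tight VPC address space.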

Domain Name System (DNS)

DNS provides name-to-IP-address resolution and allows services to automatically create name entries. GKE offers several options:

  • kube-dns – The Kubernetes-native add-on service. It runs as a deployment exposed via a cluster IP and is used by default for DNS queries from pods within a cluster. This document explains how it works.
  • Cloud DNS – Google Cloud’s managed DNS service, which can be used to manage your cluster DNS. Cloud DNS has some advantages over kube-dns:
    • It reduces the administration of a cluster-hosted DNS server.
    • It provides local DNS resolution for GKE nodes by caching responses locally, which improves both speed and scalability.
    • It integrates with the Google Cloud operations (monitoring) suite.

Service Directory can also be integrated with GKE and Cloud DNS to manage services through namespaces.

The gke-networking-recipes GitHub repo has some Service Directory examples you can try out for internal load balancers, ClusterIP, headless, and NodePort Services.

You can learn more about DNS options in GKE in the article DNS on GKE: Everything you need to know.

Load Balancing

Load balancers control access and distribute traffic across cluster resources. GKE offers several options:

  • Internal Load Balancers
  • External load balancers


HTTP(S) load balancers handle HTTP(S) traffic to your cluster. They use the Ingress resource type, which creates an HTTP(S) load balancer for GKE. To ensure the address does not change, you can assign a static address to the load balancer when configuring it.

GKE allows you to provision both internal and external Ingress. These guides show you how to configure each:

  • Ingress configuration for internal HTTP(S) load balancing
  • External load balancing

GKE also offers container-native load balancing, which directs traffic directly to pod IPs via network endpoint groups (NEGs).

Service routing

These are the main points you need to know about this topic.

  • Frontend – Exposes your service to clients, accepting traffic based on different rules. The frontend can be a static IP address or a DNS name.
  • Load balancing – Once traffic is accepted, the load balancer distributes it to resources according to rules.
  • Backend – The endpoints that receive the traffic; GKE supports many backend types.


GKE offers many ways to design your cluster’s network:

  • Standard – This mode gives the administrator the ability to manage the cluster’s underlying infrastructure. Choose it if you require greater control and are willing to take on the responsibility.
  • Autopilot – GKE provisions and manages the cluster’s underlying infrastructure. This configuration is ready to use and allows hands-off management.
  • Private cluster – Allows only internal IP connections. If clients need access to the internet (e.g. for updates), Cloud NAT can provide it.
  • Private services access – Lets your VPC connect to a service producer via private IP addresses. Private Service Connect allows private consumption of services across VPC networks.

Bringing everything together

Here is a brief, high-level overview.

  • Your cluster assigns IP addresses to different resources
    • Nodes
    • Pods
    • Containers
    • Services
  • IP address ranges are reserved for the various resource types; subnetting lets you adjust the size of each range to suit your needs. It is a good idea to limit external access to your cluster.
  • By default, pods can communicate with each other across the cluster.
  • A Service is required to expose applications running in pods.
  • Services are assigned cluster IPs.
  • You can either use kube-dns for DNS resolution or Google Cloud DNS within your GKE Cluster.
  • External and internal load balancers can both be used with your cluster to distribute traffic and expose applications.
  • Ingress handles HTTP(S) traffic using Google Cloud’s HTTP(S) load balancing service, and can be configured as internal or external.

Google Cloud enables smarter and greener energy use


Energy bills are a rising expense and consumers are facing difficult times, but the climate crisis hasn’t gone away, and sustainability remains a top priority for consumers and businesses. Around 40% of the UK’s emissions come from homes, including electricity, heating, and transport. People often don’t have the time or resources to research and test the many ways to save energy while juggling multiple demands. At Kaluza, we have made it our mission to help people save money while reducing their household emissions.

Born out of OVO Energy in 2019, Kaluza is a software-as-a-service company that helps accelerate the shift to a zero-carbon world. Our Kaluza Energy Retail product allows energy companies to put their customers at the center of this transformation by giving them real-time insights that can help lower their bills. Kaluza Flex’s advanced algorithms charge millions of smart devices at the most affordable and sustainable times. Kaluza partners with some of the largest energy and OEM companies in the world, including AGL in Australia, Fiat, Nissan, and Chubu in Japan.

Using Google Cloud data to support our 2030 carbon-negative goal

We want to prevent 10 million tonnes of CO2 emissions by 2030, which we will achieve by reaching 100 million energy users and cutting our energy retail clients’ cost to serve by 50%. But that’s only half the picture: we also want to dramatically reduce our own emissions as we accelerate the energy transition for customers. We are committed to being carbon negative by 2030, even as the world rushes towards net zero.

However, we cannot reduce what we don’t measure. To track the effect of our cloud usage, we created an internal carbon footprint tool. Our technology stack spans multiple cloud estates, and Google Cloud’s Carbon Footprint solution makes it easy to obtain emissions data for our Google Cloud applications.

We get half-hourly information about the electricity usage of every process we run on Google Cloud, which allows us to pinpoint each process’s carbon emissions. These insights helped us create Kaluza’s carbon footprint tool, which combines data from all our cloud providers into dashboards that have been invaluable for our data team.

Green Development: Reducing emissions by 97%

Our carbon emissions tool lets our teams get into the details of the data, identify what is driving their carbon footprint, and work out how to address it. This is where the fun begins, because better data translates into real sustainability projects. We have launched two large-scale initiatives so far.

The first is Green Software Development. We created a Green Development Handbook containing best practices and guides that software engineers and developers can use to make their software more sustainable. For example, we were able to combine a number of large BigQuery queries into a single query run at a more convenient time and place, resulting in a 97% decrease in emissions: 6 kg less CO2 every time the query runs. This is just one of the many ways we are making a difference.

Improving cloud infrastructure efficiency

Our second major initiative concerns our cloud infrastructure. One of the most effective ways to reduce carbon emissions is to choose a cleaner cloud, or a cleaner region in which to run workloads. Google Cloud provides carbon data for all regions, including the hourly average percentage of carbon-free energy at each location and the carbon intensity of the local electricity grid.

We can find cloud waste by digging into the data and taking corrective action. While many of our workloads must run continuously throughout the day, not all of them have to run at specific times, which opens up opportunities for optimization. We use data from Google Cloud to understand the state and performance of our workloads; combining this with grid carbon intensity data, we can identify workloads to reschedule to lower-intensity times, with a positive effect on Kaluza’s emissions.
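The rescheduling idea is simple to sketch. The snippet below is an illustration, not Kaluza’s tooling: the intensity readings and the allowed-hours window are hypothetical, standing in for the per-region grid carbon intensity data Google Cloud publishes.

```python
def pick_greenest_hour(carbon_intensity_by_hour, allowed_hours):
    """Choose the run hour with the lowest grid carbon intensity.

    carbon_intensity_by_hour: hypothetical {hour: gCO2/kWh} readings,
    such as those derived from per-region grid carbon data.
    allowed_hours: the window in which the (non-time-critical)
    workload is permitted to run.
    """
    return min(allowed_hours, key=lambda h: carbon_intensity_by_hour[h])

# Illustrative readings: overnight hours are cleaner than midday here.
intensity = {0: 120, 3: 80, 13: 310, 22: 95}
print(pick_greenest_hour(intensity, allowed_hours=[0, 3, 22]))  # 3
```

The same lookup generalizes to picking a cleaner region rather than a cleaner hour: swap the hour keys for region names and the window for the list of regions the workload may run in.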

Data to empower people to make an impact

One thing unites many of our sustainability projects: they are bottom-up initiatives developed with and by our team. With emissions data at our disposal, we organize hackathons and Green Development days to encourage action and test new ideas.

Our core mission is to make sustainability accessible and actionable for everyone, and we’re now bringing the same idea to our teams. The feedback has been encouraging: one of our employees said he now understands the impact his role has on Kaluza’s sustainability and the future of the planet. By giving our employees the ability to take climate action, we are putting sustainability at the company’s core, and by showing them the direct effects of their work, we encourage them to build stronger carbon-saving solutions.

Making electric cars more sustainable by turning them into green power stations

Kaluza offers many opportunities to make a positive impact. One of our sustainability pillars is our internal pledge to reduce carbon emissions and pass these savings on to our energy retail clients. Google Cloud solutions are also being used for other exciting projects, such as the first and largest domestic vehicle-to-grid (V2G) technology deployment, led by OVO Energy and Nissan.

With V2G, drivers can charge their electric cars when there is plenty of renewable energy and sell it back to the grid when there isn’t enough. By analyzing grid and vehicle data with Google Cloud, we’re turning millions of cars into batteries, helping drivers earn hundreds of pounds per year while making the system more sustainable. In a market like California, this could help reduce peak grid stress by up to 40%.

Together, we can power the future of energy

Kaluza uses technology to simplify the energy transition for clients and customers, from homes to cars and everything in between. We are excited to continue working with Google Cloud to grow our business and deliver new energy solutions. We are determined to be a leader in sustainability, and we have found a cloud vendor that shares our sustainability goals. Together, we are building a world where net zero is within everyone’s reach.

How CISOs must adapt their mental models to cloud security

Security leaders often go into the cloud with tools, practices, skills, and mental models that were developed on-premises. This can lead to efficiency and cost problems. However, it is possible to map those mental models to the cloud.

When trying to understand the differences between cloud and on-premises security models, it is helpful to look at the types of threats each model is trying to detect or block.

On-premises threats traditionally focused on data theft from corporate databases and file storage, resources best protected with layers of network, endpoint, and sometimes application security controls. The corporate data “crown jewels” were not accessible to the outside world via an API, nor were they stored in publicly accessible storage buckets. Other threats aimed to disrupt operations or deploy malware for various purposes, including outright theft or holding data for ransom.

Some threats are specific to the cloud. Bad actors will always try to exploit the cloud’s ubiquitous nature, scanning IP ranges for open storage buckets and internet-exposed compute resources.

Gartner explains that cloud security requires major changes in strategy compared with protecting on-premises data centres. To protect critical cloud deployments, processes, tools, and architectures must be developed using cloud-native methods. When you start cloud adoption, it is important to understand the security responsibilities of your cloud service provider versus those of your company; doing so makes you less vulnerable to attacks on cloud resources.

Cloud security transformations are a great way to prepare CISOs for the threats of today, tomorrow, and beyond, but they require more than a plan and a few projects. CISOs and cybersecurity team leaders need to create new mental models for thinking about security, which means translating existing security knowledge into cloud realities.

To set the stage for this discussion, let’s define “cloud native.” A cloud-native architecture is one that makes the most of the distributed, scalable, and flexible nature of public clouds. Although the term implies that one must be born in the cloud, that is not the intent; a better term might be “cloud-focused,” meaning doing security “the cloudy way.”

However we define it, adopting cloud lets you maximize your focus on writing code, creating business value, and keeping your customers happy while taking advantage of the cloud’s inherent properties, including security. Simply lifting and shifting your existing security tools and practices to the cloud risks transferring legacy errors, some predating cloud by decades, into future cloud environments.

Cloud native means removing layers of infrastructure such as networks, servers, security appliances, and operating systems, and using modern tools designed for cloud computing. Another way to look at it: you won’t have to worry about those layers as you build code, which makes your life easier. This is the key to success; security will follow the DevOps and SRE revolutions in IT.

This thinking extends to cloud-native security. In this scenario, some of your existing tools are combined with solutions offered by cloud service providers, and you take advantage of cloud-native architecture to protect what is built in the cloud. We have already discussed the differences between threats targeting on-premises infrastructure and those targeting the cloud. Here are some other important areas to reevaluate when building a cloud security mental model.

Network security

Some companies treat the cloud like a rented data centre when it comes to network security, but many traditional methods that worked well on-premises for decades are not suitable for cloud computing.

Concepts like the demilitarized zone (DMZ) can be adapted for today’s cloud environments. A modern approach to the DMZ could use microsegmentation to control access for each identity within context. Ensuring that the right identity has access to the right resource in the right context gives you strong control, and even if you make a mistake, microsegmentation limits the breach blast radius.

Cloud-native organizations also encourage new approaches to enterprise network security, such as BeyondProd. Organizations benefit because they can focus on who and what has access to their services, rather than where requests originate.

Cloud adoption can have a profound impact on network security, but not all areas will change in the same manner.

Endpoint security

The concept of an endpoint changes in the cloud. Is a virtual server an endpoint? What about containers, or microservices? In the Software as a Service model there is no endpoint to manage at all. What matters is knowing what happens where along the cloud security path.

This mental model can help: think of an API as a type of endpoint. Cloud APIs can then benefit from some of the security thinking developed for endpoints. Concepts such as access security, permissions, and privileged access transfer over, but maintaining an endpoint operating system no longer applies.

Insecure agents can pose a risk to their clients even when they have been automated to run on virtual machines in a cloud environment. For example, the Microsoft Azure cross-tenant vulnerability highlighted an entirely new type of risk, one that many customers were not even aware of.

This is why, among the many facets of endpoint security, some vanish (such as patching the operating system under SaaS or PaaS), others survive (such as the need to secure privileged access), and still others are transformed.

Detection and response

A move to the cloud changes the threat landscape and the way you respond to it. You can use on-prem detection technology and methods as a foundation, but on their own they won’t reduce risk in the way most cloud-first companies require.

The cloud offers the chance to rethink your security goals, including availability, reliability, confidentiality, and integrity.

Cloud is distributed, immutable, API-driven, automatically scalable, and focused on the identity layer, and workloads are often ephemeral, created for a specific task. All of these factors affect how you handle cloud threat detection and call for new detection methods.

Cloud threat detection is best organized around six domains: API, managed services, network, identity, compute, and container infrastructure. Each domain comes with its own detection mechanisms and telemetry, such as API access logs and network traffic captures.

Some approaches become less important (e.g. network IDS on encrypted connections), others grow in importance (such as detecting access anomalies), and others transform (such as detecting threats from the provider backplane).
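To make “detecting access anomalies” concrete, here is a deliberately tiny, invented illustration (not a real detection product): build a baseline of which identities normally touch which resources from historical logs, then flag first-seen combinations.

```python
# Toy access-anomaly detector: learn normal (identity, resource) pairs,
# then report any event that falls outside that baseline.
from collections import defaultdict

def build_baseline(events):
    """events: iterable of (identity, resource) pairs from historical logs."""
    seen = defaultdict(set)
    for identity, resource in events:
        seen[identity].add(resource)
    return seen

def anomalies(baseline, new_events):
    """Return events where an identity touches a resource it has never used."""
    return [(i, r) for i, r in new_events if r not in baseline.get(i, set())]
```

Real cloud detection pipelines work over API access logs and other telemetry at far larger scale, but the shape of the idea, baseline plus deviation, is the same.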

Data security

The cloud is changing the way we think about data security.

Cloud adoption puts you on the path to what Google calls “autonomic security,” in which security is integrated into every stage of the data lifecycle and continuously improves. It makes the cloud easier to use by freeing users from a multitude of rules about who can do what, when, and with which data. It also lets you keep up with ever-changing cyberthreats and business changes, and helps you make business decisions faster.

As in the other categories, certain data security methods lose their importance or disappear, for example, manual data classification at cloud scale. Other approaches remain as important in the cloud as they were on-prem, while still others transform (e.g. pervasive encryption with secure key management).

Identity and access management

Your cloud environment is not the same as your data center when it comes to identity and access management (IAM). In the cloud, every person and every service has its own identity, and you want to be able to control what each one can access.

IAM lets you manage cloud resources centrally with fine-grained access control. Administrators can grant permissions to specific resources, giving you full control over, and visibility into, your cloud resources. Whether you have a complex organizational structure, hundreds of workgroups, or multiple projects, IAM provides a single view of security policy across your entire organization.

Access management tools let you grant cloud access at fine-grained levels, far beyond the project level. You can also create access control policies for specific resources based on attributes such as device security status, IP address, and resource type. These policies ensure that the appropriate security controls are in place when cloud resources are accessed.
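A minimal sketch of what such attribute-based conditions might look like in code (the attribute and key names here are illustrative, not a real IAM API): every condition in the policy must hold for the request’s attributes, otherwise access is denied.

```python
# Toy attribute-based condition evaluator (illustrative names only):
# a request is allowed only if it satisfies every condition in the policy.
import ipaddress

def check_conditions(attrs: dict, conditions: dict) -> bool:
    """Return True only if the request attributes satisfy every condition."""
    # Condition 1: request must originate from an allowed IP range
    if "ip_range" in conditions:
        if ipaddress.ip_address(attrs["ip"]) not in ipaddress.ip_network(conditions["ip_range"]):
            return False
    # Condition 2: device must report a secure posture
    if conditions.get("require_secure_device") and not attrs.get("device_secure", False):
        return False
    # Condition 3: only certain resource types may be accessed
    if "resource_types" in conditions and attrs["resource_type"] not in conditions["resource_types"]:
        return False
    return True
```

In a real IAM system these conditions are attached to roles and evaluated by the platform on every request; the sketch only shows the evaluation logic itself.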

This is where zero trust plays a strong role. Implicit trust in any single component of a complex, interconnected system poses significant security risk, so trust must be established through multiple mechanisms and continually verified. A zero trust security framework protects cloud-native environments by requiring that every user be authenticated, authorized, and validated for security configuration and posture before being granted access to cloud-based apps and data.

This means the IAM mental model from on-premises security generally survives, but many of the underlying technologies change dramatically, and IAM’s importance to security increases significantly.

Cloud security: Shared destiny for more trust

The cloud is more than just “someone else’s computer.” Trust is a crucial component of your relationship with a cloud service provider. Providers often describe this as shared responsibility: they supply the infrastructure, but you remain responsible for many complex security tasks.

Google Cloud operates in a shared fate model, managing risk together with our customers. It is our responsibility to ensure that customers can deploy securely on the platform, so rather than simply delineating where our responsibility ends, we help with best practices for safely migrating to, and operating in, a trusted cloud.

As a student, you can improve your data analysis skills


You may be a college student preparing to enter the job market.

Data analysis is the area that I always return to when I think of high-value areas to focus my technical skills in.

Data analysis is important.

In today’s technology-driven society, data in all its forms is increasingly valuable for the insights it provides. All fields are seeing an exponential increase in the amount of data generated. This is good news for students: you can learn data analysis to enhance your existing skills in any field, including marketing, computer science, and even music. No matter your background, the ability to manipulate, process, and analyze data will help you get ahead.

What should I consider when learning new skills?

It can be daunting to learn new skills and tools in tech on top of your coursework, jobs, or internships. Trust me, it’s not easy. That’s why it’s important for students to be strategic and efficient in choosing the best resources for learning.

When I learn new software or skills, there are some factors that I consider.

  • What is the estimated cost of this?
  • How much time will this take?
  • What relevance does this have to my job prospects and career?

Cost is something I can’t pretend not to think about. Knowing how to manage your finances is essential, especially when you are investing in your career.

What about time? That is a cost, too. Students value time as much as money: we balance coursework, studying, family, extracurriculars, career growth, and sometimes even a job. We look for skills that can be learned quickly, on our own, and in our own time.

Finally, I want the skill or tool I learn to be relevant to my job search, so I can list it on my resume and make myself more appealing to the kinds of companies I will be applying to. This sort of self-study is essential for career advancement, which is why I look for opportunities to learn industry-standard software and services directly.

Learning data analysis using Google Cloud

My internship at Google has given me ample opportunities to improve my data analysis skills using Google Cloud services. This blog post focuses on two of them: BigQuery and Data Studio.

What’s BigQuery?

BigQuery lets companies run analytics on large datasets in the cloud. It is also a great place to learn and practice SQL (the language used for analyzing data). BigQuery’s “getting started” process is very easy and saves students a lot of time: instead of installing database software and sourcing data to load into tables, you simply log in to the BigQuery Sandbox and immediately begin writing SQL queries (or copying samples) to analyze the data provided by the Google Cloud Public Datasets program. You’ll be able to see the difference for yourself soon.

What’s Data Studio?

Data Studio integrates with BigQuery to let you visualize data in interactive, customizable tables, dashboards, and reports. You can use it to visualize the results of your SQL queries, and it is also useful for sharing insights with non-technical users.

Because Data Studio is part of Google Cloud, you don’t need to export query results to another tool: a direct connection to BigQuery lets you visualize the data, saving time and eliminating worries about file compatibility and size.

BigQuery and Data Studio are free to use within the Google Cloud Free Tier. The free tier lets users store a limited amount of data (if you wish to upload your own) and process a set number of queries per month. The free BigQuery sandbox doesn’t require a credit card, and there are no fees to set it up.

BigQuery and Data Studio are free to use; now let’s talk about their applicability. Both are used in many industries today for production workloads. Search for BigQuery and Data Studio on LinkedIn to see what I mean.

Get started with BigQuery and Data Studio

Let’s get down to business. Let me show you how easy it is to use both of these tools, with a quick tutorial to help you get started using BigQuery and Data Studio.

Let’s look at an example situation that BigQuery can solve.

Congratulations! You are a new intern at a company that requires new employees to come in for training during their first few weeks, and you must show up on time. The office is located in New York City, and there is no parking available nearby. You know that New York City has a public bike-share program, so you have decided to use bike sharing to get to work.

You must arrive at work on time. Here are some key questions to answer:

  • What stations nearby have bicycles you can use in the morning?
  • Is there a drop-off point that is close to the office?
  • Which stations are busiest?

These questions can be answered using a public dataset. BigQuery offers tons of datasets you can use at no cost; this example uses the New York Citi Bike dataset.

How to get set up

    1. First, create a BigQuery Sandbox. This is basically the environment you will use to do your work.
    2. Go to the BigQuery page in the Google Cloud console.
    3. In the Explorer pane, click +Add Data > Pin a Project > Enter project name.
    4. Type “bigquery-public-data” and click Pin. This project includes all the datasets available in the public datasets program.
    5. Expand the bigquery-public-data project to see the underlying data, and scroll down until you find “new_york_citibike”.
    6. Expand the citibike_stations and citibike_trips tables; highlight a table to see its schema and a preview of the data.
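Once the dataset is pinned, answering the questions above comes down to ordinary SQL. To give a feel for the kind of query you would run in BigQuery, here is the “busiest stations” aggregation executed locally with sqlite3 on a few invented trips (the column name follows the public citibike_trips table; the rows are made up):

```python
# Run the "which stations are busiest?" query on a toy in-memory table.
# In BigQuery you would run the same SELECT against
# bigquery-public-data.new_york_citibike.citibike_trips instead.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE citibike_trips (start_station_name TEXT)")
conn.executemany(
    "INSERT INTO citibike_trips VALUES (?)",
    [("W 21 St",), ("W 21 St",), ("8 Ave",), ("W 21 St",), ("8 Ave",), ("Broadway",)],
)

busiest = conn.execute("""
    SELECT start_station_name, COUNT(*) AS trips
    FROM citibike_trips
    GROUP BY start_station_name
    ORDER BY trips DESC
    LIMIT 3
""").fetchall()
print(busiest)  # [('W 21 St', 3), ('8 Ave', 2), ('Broadway', 1)]
```

The GROUP BY / COUNT / ORDER BY pattern shown here is exactly what you would write in the BigQuery console, just pointed at the real dataset.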

Visualize the results
A great feature of BigQuery is its integration with Data Studio, which lets you visualize your results with ease. Just click the Explore Data button on the query results page! This will help you get a better understanding of the data your query returned.

If you’re interested in trying Data Studio out for yourself, I suggest following this tutorial. It also covers bikeshare trips, but this time it is in Austin, Texas!

Next step

It’s that easy! Google Cloud is simple to learn and use, so you spend less time “getting going” and more time analyzing data and creating visualizations. It is easy to see the benefits of using tools like these in your professional and personal tech development, and skills like BigQuery are a great way to build your data analysis abilities and help your early career in data science.

Retailers must “always be pivoting.” Here are three steps to keep going.



Retailers have been told for years that they need to embrace new technologies, trends, and imperatives such as online shopping, mobile apps, and omnichannel. Retailers adopted many of these technologies in an effort to grow and stabilize their business, only to discover that more change was on the way.

Then came the pandemic, rising social movements, and harsher weather. These disruptions were not uniformly bad for retailers, but some were better prepared than others, and that revealed a universal truth: adaptability is the key to survival and growth.

The retail environment today presents both new and familiar challenges to specialty and department store merchants. Not long ago, 88% of purchases were made in a physical store; now it’s closer to 59%, with the rest happening online or through other omnichannel routes.

It can feel like ABP is the mantra of our times: Always Be Pivoting.

The key question isn’t whether to maintain momentum and agility, but how to do so without draining your workforce, inventory, or profits. The pivot is now a given; how you execute it is what matters.

Retailers must adapt quickly and have the right technology in place to scale seamlessly.

They must be able to leverage real-time insights and quickly improve customer experiences, both online and offline (not to mention the growing hybridization of AR and VR). They need to modernize their stores to create engaging customer and associate experiences. And they must improve operations to allow rapid scaling from full operations to digital-only offerings.

Google Cloud offers three essential innovations to help retailers reach these goals: demand forecasting that uses the power of data analysis and artificial intelligence; enhanced product discovery to increase conversion across channels; and tools to help create the modern store experience.

These are just a few of the ways that we can help you pivot.

Pivot point 1 – Harnessing AI and data for demand forecasting using Vertex AI

When it comes to building organizational flexibility, one of the biggest challenges retailers face is managing inventory and the supply chains.

The pandemic created a global supply chain crisis with unprecedented demand and logistical problems, making it even harder for retailers to assess demand and availability. Even in normal times, inventory mismanagement is a trillion-dollar problem: according to IHL Group, out-of-stocks cost $634 billion each year in lost sales, while overstocks cost $472 billion in revenue lost to markdowns.

Optimizing the supply chain also leads to higher profits. McKinsey estimates that a 10-20% improvement in retail supply chain forecasting accuracy can produce a 5% decrease in inventory costs and a 2-3% increase in revenue.

Some of the problems associated with demand forecasting are:

  • Low accuracy can lead to excessive inventory and missed sales. This puts pressure on fragile supply chains.
  • The real drivers of product demand are hard to capture, because traditional methods struggle to model large datasets.
  • Poor accuracy in new product launches, and products with low or intermittent demand.
  • Complex models can be difficult to understand, leading to poor product allocations and low returns on promotional investments.
  • Different departments may use different methods which can lead to miscommunications and costly reconciliation mistakes.

AI-based demand forecasting addresses these problems. Vertex AI Forecast helps retailers maintain greater inventory flexibility by incorporating machine learning into existing systems. Machine learning-based forecasting models like those in Vertex AI can process large amounts of disparate data and automatically adjust to new information.

These machine learning models let retailers use not only historical sales data but also close-to-real-time signals such as marketing campaigns, web actions (a customer clicking the “add to cart” button on a site), local weather forecasts, and more.
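As a deliberately naive illustration of why such extra signals matter (this is a toy function, not Vertex AI Forecast, and the uplift factor is invented), a baseline built only from historical sales can be adjusted the moment a real-time signal such as an active promotion arrives:

```python
# Toy demand forecast: average of recent sales, scaled by a promotion signal.
# Real ML forecasters learn these effects from data instead of hard-coding them.
def forecast(history, promo_active, promo_uplift=1.5):
    """Return expected demand: mean of recent sales, scaled during promotions."""
    baseline = sum(history) / len(history)
    return baseline * (promo_uplift if promo_active else 1.0)

print(forecast([100, 110, 90], promo_active=False))  # 100.0
print(forecast([100, 110, 90], promo_active=True))   # 150.0
```

A model that ignores the promotion would keep predicting the historical average and under-stock; the point of ML-based forecasting is to learn such adjustments, across many signals, automatically.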

Pivot point 2 – Enhanced product discovery via AI-powered search and recommendations

Customers will look elsewhere if they can’t find the product they need online or in-store. It’s a simple statement with profound implications.

Research by The Harris Poll and Google Cloud found that, over a six-month period, 95% of consumers surveyed received search results that were not relevant to what they were searching for. 85% of consumers have rejected search results that did not match their query, and 74% said they avoid websites where they have encountered search problems in the past.

Search abandonment, which happens when a customer searches for a product but doesn’t find it on the retailer’s site, costs retailers over $300 billion annually. Our product discovery solutions help surface the right products for the right customers at the right time. These solutions include:

  • Vision Product Search brings an augmented-reality-style experience, similar to Google Lens, into a retailer’s mobile app. Shoppers and store associates can search for similar products using images they have taken or found online, and receive a ranked list of similar items.
  • Recommendations AI lets retailers deliver highly personalized recommendations across multiple channels.
  • Retail Search provides Google-quality search results on a retailer’s own website and mobile apps.

Google Cloud powers all three, leveraging Google’s advanced understanding of user context and intent to provide a seamless shopping experience for every customer. Combined, these capabilities help retailers reduce search abandonment and increase conversions across all their digital properties.

Pivot point 3 – Building a modern store

The store is no longer just a place to browse and buy. Stores must be flexible and able to adapt to changing conditions. The modern store serves multiple functions: mini fulfillment and returns center, recommendation engine, shopping destination, fun place to work, and many more.

Just as retail businesses had to embrace omnichannel, stores themselves are becoming omnichannel centers, combining the digital and the physical in one location. Retailers can use physical stores to deliver superior customer experiences, but this requires greater collaboration among store, digital, and tech infrastructure teams, building on the agile ways they already work together.

It’s all about making physical spaces more digitally capable. Google Cloud helps physical locations upgrade their infrastructure and modernize both customer-facing and internal applications.

Think of it like a new OS release for your phone: it’s the same box, but the user experience can be quite different. Extend this idea to a digitally enabled shop, and the team can create new experiences, whether for sales displays, fulfillment, or employee engagement, simply by updating the store’s displays, interfaces, and tools.

This approach can produce simpler customer experiences and better-equipped store associates. Cloud solutions can automate ordering, replenishment, and fulfillment for omnichannel orders.

Customers can use tools similar to those they use to find personalized products online, letting them browse, explore, and even create a customized shopping experience in the store.

Technology can maximize the impact of store associates by giving them the expertise that drives value-added service and productivity, while also reducing overhead costs. And customers should enjoy frictionless checkout and be able to make secure, reliable transactions.

Google Cloud can help retailers transform

Retailers need modern tools to pivot and adapt to changing consumer demands. We believe every company can become a tech company, every decision can be data-driven, every store can be digital and physical at the same time, and every worker can be a tech worker.

Google Cloud helps retailers solve their most difficult problems. Our unique capabilities include managing large amounts of unstructured information and advanced AI/ML. Our products and solutions help retailers focus on what matters most: improving operations and capturing digital and multichannel revenue.

Cloud Computing: 12 Benefits


Cloud computing has existed for nearly two decades. Despite its many benefits, including cost savings and competitive advantages, many businesses continue to operate without it. A study by the International Data Group found that 69% of businesses already use cloud technology in some capacity, while 18% plan to adopt cloud computing solutions at some point. Meanwhile, according to Dell, companies that invest heavily in big data, cloud, and mobility enjoy up to 53% faster revenue growth than their competitors.

This data shows that a growing number of tech-savvy companies and industry leaders are realizing the many benefits of cloud computing. They are also using the technology to improve their businesses, serve customers better, and increase their overall profit margins.

All of this suggests that, given the direction the industry is heading, it’s never been more important to get your head in the cloud.

Cloud computing has been gaining popularity in recent years. With the rapid increase in data usage that has accompanied the transition into the digital 21st century, it is becoming increasingly difficult for individuals and companies to keep all of their important programs and data on in-house computers. The solution to this problem has existed nearly as long as the internet, but it has only recently gained widespread adoption among businesses.

Cloud computing works on the same principle as a web-based email client: users can access all the features and files of the system without needing to store most of it on their own computers. In fact, many people already use cloud computing services without realizing it. Gmail, Google Drive, and TurboTax are all cloud-based applications.

Users send their personal data to a cloud-hosted server, which stores it for later access. These applications are useful for personal use, but even more so for businesses that need to access large quantities of data over an internet connection.

Employees can, for example, access customer information in cloud-based CRM software such as Salesforce from their smartphone or tablet, at work or on the road, and quickly share that information with authorized parties around the globe.

Still, some leaders remain hesitant to commit their companies to cloud computing solutions. So we’d like to take a few minutes to share 12 benefits of cloud computing.

  1. Cost Savings
  2. Security
  3. Flexibility
  4. Mobility
  5. Insight
  6. Greater Collaboration
  7. Quality Control
  8. Disaster Recovery
  9. Loss Prevention
  10. Automatic Software Updates
  11. Competitive Edge
  12. Sustainability

  1. Cost Savings: If you’re worried about the price tag of switching to cloud computing, you aren’t alone; 20% of organizations are concerned about the initial cost of implementing a cloud-based server. But when weighing the advantages and disadvantages of the cloud, you need to look beyond the initial cost and consider the ROI. Once you’re on the cloud, easy access to your company’s data saves time and money on project startup. And because most cloud computing services are pay as you go, there’s no fear of paying for features you neither need nor want. If you don’t take advantage of the cloud’s features, you won’t be spending money on them.
    Pay-as-you-go also applies to the data storage space needed to serve your clients and stakeholders: you get exactly as much space as you need, and you won’t be charged for space you don’t use. Together, these factors bring lower costs and better returns. In a Bitglass survey, half of all CIOs and IT leaders reported cost savings in 2015 as a result of using cloud-based applications.

  2. Security: Many organizations have security concerns when adopting cloud computing solutions. If files, programs, and other data aren’t kept securely on-site, how can you be sure they’re protected? What’s to stop a cybercriminal from accessing your data remotely? Quite a lot, actually.
    For one thing, a cloud host’s full-time job is to monitor security, which is significantly more effective than an in-house system that has to split its efforts among a multitude of IT concerns. And while most businesses don’t like to openly consider the possibility of internal data theft, a shockingly large percentage of data thefts happen internally and are committed by employees. When that is the case, it can actually be safer to keep sensitive information off-site. Of course, this is all rather abstract, so let’s look at some solid statistics.
    RapidScale found that 94% of businesses saw an improvement in security after switching to the cloud, and 91% said the cloud makes compliance easier for them. This increased security comes largely from the encryption of data transmitted over networks and stored in databases. Encryption makes your information less accessible to hackers and anyone else not authorized to see it, and as an added measure, most cloud-based services let you set different security levels for different users.
  3. Flexibility: Your business has only a finite amount of focus to divide among all its responsibilities. If your current IT solutions force you to pay too much attention to data storage and computer issues, you won’t be able to concentrate on customer satisfaction and business goals. Relying on an outside company to manage your IT infrastructure and hosting frees you to spend more time on the areas of your business that directly affect your bottom line.
    Cloud hosting also offers more flexibility than hosting on a local server. If you need extra bandwidth, a cloud-based service can provide it instantly, without a costly (and complex) upgrade to your IT infrastructure. This increased freedom and flexibility can significantly improve your company’s efficiency. In an InformationWeek survey, 65% of respondents said “the ability to quickly meet business demands” was a main reason a company should migrate to a cloud environment.
  4. Mobility: Cloud computing provides mobile access to corporate data through smartphones and other devices, which, considering there are more than 2.6 billion smartphones in use worldwide today, is an excellent way to make sure no one is left out. This feature is great for staff with hectic schedules or who live far from the office: they can stay instantly up to date with clients and coworkers.
    Through the cloud, you can also offer easily accessible information to remote workers, freelancers, and traveling sales staff, for better work-life balance. It’s no surprise that companies with employee satisfaction as a priority are up to 24% more likely than others to expand cloud usage.

  5. Insight: As we move further into the digital age, it is becoming clear that the old adage “knowledge is power” has taken on the more modern and accurate form “data is money.” Hidden within the millions of bits of data surrounding your customer transactions and business processes are valuable, actionable nuggets just waiting to be discovered and acted upon. Of course, sifting through all that data to find those kernels can be very difficult without the right cloud computing solution.
    Many cloud-based storage solutions offer integrated cloud analytics, giving you a bird’s-eye view of your data. With your information in the cloud, you can easily implement tracking and build customized reports to analyze information across the organization. From those insights, you can increase efficiency and create action plans to meet organizational goals. For example, Sunny Delight was able to increase profits by approximately $2 million per year and cut $195,000 in personnel costs through cloud-based business insights.

  6. Increased Collaboration: If your business has two or more employees, you should make collaboration a top priority; there isn’t much point in having a team that can’t work like a team. Cloud computing makes collaboration easy and secure: team members can view and share information readily across a cloud-based platform. Some cloud-based services even offer collaborative social spaces that connect employees across your organization, increasing interest and engagement. Collaboration is possible without a cloud computing solution, but it will never be as easy or as effective.

  7. Quality Control: Few things hinder a company’s success like poor-quality, inconsistent reporting. In a cloud-based system, all documents are stored in one place and in a single format. With everyone accessing the same information, you keep data consistent, avoid human error, and maintain a clear record of revisions and updates. By contrast, managing information in silos can lead to employees accidentally saving different versions of documents, which causes confusion and diluted data.

  8. Disaster Recovery: Control is a key factor in the success of any business, but no matter how well-informed your company is about its own processes, some things will always be out of your hands. In today’s market, even small amounts of downtime can have a devastating effect: downtime in your services leads to lost productivity, lost revenue, and a damaged reputation.

    While there is no way to prevent or even anticipate every disaster that could harm your company, there are things you can do to speed up your recovery. Cloud-based services offer fast data recovery in all kinds of emergency scenarios, from natural disasters to power outages. Twenty percent of cloud users claim disaster recovery in four hours or less, while only 9% of non-cloud users could say the same, and 43% said they intend to invest in cloud-based disaster recovery strategies.
  9. Loss Prevention: If your company doesn’t invest in cloud computing, all of your valuable data is tied to the office computers it resides on. This may not seem like a problem, but if the local hardware fails, you could permanently lose your data. Computer malfunctions happen for many reasons, including viruses, age-related hardware deterioration, and simple user error, and machines can also be misplaced or stolen.
    Without access to the cloud, you risk losing all of that information. With a cloud-based server, everything you’ve uploaded stays safe and accessible from any computer connected to the internet, even if the machine you normally use stops working.

  10. Automatic Software Updates: For those with a lot of work to do, few things are more frustrating than waiting for system updates to install. Cloud-based software refreshes and updates itself, rather than requiring the IT department to perform a manual, organization-wide update. This saves valuable IT staff time and the money spent on outside IT consulting. PCWorld reports that half of cloud adopters cite needing fewer internal IT resources as a cloud benefit.

  11. Competitive Edge: While cloud computing is becoming more popular, some still prefer to keep everything local. That’s their choice, but it puts them at a distinct disadvantage compared with those who have the benefits of the cloud at their fingertips. If you adopt a cloud-based solution before your competitors, you’ll be further along the learning curve by the time they catch up. A Verizon survey found that 77% of businesses believe cloud technology gives them a competitive edge, and 16% consider this advantage significant.

  12. Sustainability: Given the current state of the environment, it’s no longer enough for businesses to simply place a recycling bin in the breakroom and claim they’re doing their part. Real sustainability requires solutions that address wastefulness at every level of a business. Hosting on the cloud is more environmentally friendly and results in a smaller carbon footprint.

    Cloud infrastructures promote environmental proactivity by powering virtual services instead of physical products and hardware, cutting down on paper waste, improving energy efficiency, and allowing employees to access their work from anywhere with an internet connection. Based on cloud computing and other virtual data options, a Pike Research report forecast that data center energy consumption would drop by 31% between 2010 and 2020.