Google Cloud

How to secure your data with Google Cloud

[Multicloud]

Discover our Big Data services on Google Cloud

Secure your data and take advantage of the possibilities offered by Google Cloud, whether you run a hybrid architecture or are 100% on Google Cloud.
Our certified experts will work with you from start to finish, deploying the solution best suited to your needs in just a few days, while taking into account your existing systems and security requirements.


(Option 1) Backup only

Capture your on-premises data, store it in GCP storage (Persistent Disk, Nearline, Coldline), and perform incremental backups for a low RTO.

Ensure high availability of your captured on-premises data against infrastructure failures.

(Option 2) Manual backup and disaster recovery

Manage and define sophisticated data capture and data flows across multiple use cases and locations. Define your own backup model incorporating your SLA targets.

Prepare for DR in a GCP region with persistent disk snapshots (incremental backups).

Application instances in the 2nd region in shutdown mode.

(Option 3) Automatic backup and disaster recovery

Retain the flexibility of the VMware model and benefit from the high availability of the cloud.

All DR instances in the 2nd region on active standby.

Application servers and databases:

  • Infrastructure availability ready to evolve
  • Backup application servers in Google Cloud infrastructure
  • Database replication using native replication features

Scalability:

  • Solution fully customizable to customer needs
  • Use of native hybrid GCP cloud features and Actifio solutions
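The incremental approach behind Option 1 — back up everything once, then transfer only what changed — can be illustrated with a short, hypothetical Python sketch (not the actual product mechanism): a manifest of content digests from the previous run decides which files need to be copied.

```python
import hashlib
import json
from pathlib import Path


def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def incremental_backup(source: Path, backup: Path, manifest_path: Path) -> list[str]:
    """Copy only files whose content changed since the last run.

    The manifest (relative path -> digest) recorded by the previous
    backup decides what to transfer; small deltas instead of full
    copies are what keep backup windows, and ultimately RTO, low.
    """
    previous = {}
    if manifest_path.exists():
        previous = json.loads(manifest_path.read_text())

    current, copied = {}, []
    for path in sorted(source.rglob("*")):
        if not path.is_file():
            continue
        rel = str(path.relative_to(source))
        digest = file_digest(path)
        current[rel] = digest
        if previous.get(rel) != digest:  # new or modified file
            dest = backup / rel
            dest.parent.mkdir(parents=True, exist_ok=True)
            dest.write_bytes(path.read_bytes())
            copied.append(rel)

    manifest_path.write_text(json.dumps(current))
    return copied
```

A second run against an unchanged source tree copies nothing; touching a single file copies only that file.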
(Option 1) Backup only

Agent-free application server backup, based on the Virtual Server Agent (a Commvault function).

Complete VM-level backup of your files, with granular restore; snapshots are stored in dual-region or multi-region Cloud Storage locations to guarantee high availability.

(Option 2) Manual backup and disaster recovery

Application instances in the 2nd region in shutdown mode

Application servers and databases:

  • High availability between zones for protection against zonal failures
  • DR with inter-regional persistent disk snapshots
  • Database replication using native replication features.

File shares:

  • NFS cluster based on NetApp CVO or SUSE/Red Hat
  • Asynchronous replication for DR with rsync
  • Can also take advantage of inter-regional snapshots of attached persistent disks
(Option 3) Automatic backup and disaster recovery

Application instances in the 2nd region on active standby

Application servers and databases:

  • High availability between zones for protection against zonal failures
  • Backup application servers in DR region
  • Database replication using native replication features

File shares:

  • NFS cluster based on NetApp CVO or SUSE/Red Hat with rsync
  • Regional high availability and inter-regional disaster recovery with asynchronous replication using Linux rsync tools
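The asynchronous rsync-based replication used for the file shares rests on rsync's "quick check": skip any file whose size and modification time already match the copy on the standby side. A minimal, hypothetical Python sketch of that idea (illustrative only, not the rsync implementation):

```python
import shutil
from pathlib import Path


def needs_sync(src: Path, dst: Path) -> bool:
    """rsync-style quick check: transfer when the destination is
    missing or differs in size or modification time."""
    if not dst.exists():
        return True
    s, d = src.stat(), dst.stat()
    return s.st_size != d.st_size or int(s.st_mtime) != int(d.st_mtime)


def replicate(primary: Path, standby: Path) -> list[str]:
    """One asynchronous replication pass from primary to standby.

    Returns the relative paths that were copied. Timestamps are
    preserved (shutil.copy2), so the next pass skips unchanged files.
    """
    copied = []
    for path in sorted(primary.rglob("*")):
        if not path.is_file():
            continue
        rel = path.relative_to(primary)
        dest = standby / rel
        if needs_sync(path, dest):
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, dest)  # copies data and mtime together
            copied.append(str(rel))
    return copied
```

Run from a scheduler, each pass narrows the gap between the regions; the replication interval bounds how stale the standby copy can be.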

A clear process for an end-to-end approach

Assessment

Our teams gather your requirements and analyze your existing infrastructure (security, RTO, RPO, network connectivity), as well as your data backup policy (data retention, frequency, volume, etc.).
This is an essential step in defining the prerequisites for implementing new IT resilience solutions.
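Two rough rules of thumb drive this assessment: in the worst case you can lose up to one full backup interval of data (RPO), and recovery time (RTO) scales with the volume to restore divided by restore throughput. A back-of-the-envelope sketch, where the function names and the 0.5 h fixed overhead are illustrative assumptions:

```python
def worst_case_rpo_hours(backup_interval_hours: float) -> float:
    """Worst-case RPO: data written just before the next scheduled
    backup is lost, i.e. up to one full backup interval."""
    return backup_interval_hours


def estimated_rto_hours(data_gb: float, restore_throughput_gbph: float,
                        fixed_overhead_hours: float = 0.5) -> float:
    """Rough RTO: transfer time plus a fixed overhead for
    provisioning and validation (the 0.5 h default is an assumption)."""
    return data_gb / restore_throughput_gbph + fixed_overhead_hours
```

For example, nightly backups give a worst-case RPO of 24 h, and restoring 1 TB at 200 GB/h lands around 5.5 h of RTO, figures that then feed the choice between the three options above.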

Solution

Choice of connectivity solution and Google Cloud storage services according to use cases:

  • Long-term backup (archiving)
  • Security policy and encryption
  • Evaluation and optimization of your storage costs
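The storage-class choice mostly follows access frequency. A small illustrative helper based on Google's published guidance (Nearline for roughly monthly access, Coldline for roughly quarterly, Archive for roughly yearly); the thresholds are simplified, and the 30/90/365-day minimum storage durations that also apply are ignored here for brevity:

```python
def suggest_storage_class(expected_accesses_per_year: float) -> str:
    """Suggest a Cloud Storage class from expected access frequency.

    Simplified thresholds: hotter than monthly -> Standard; about
    monthly -> Nearline; about quarterly -> Coldline; about yearly
    or less -> Archive.
    """
    if expected_accesses_per_year > 12:
        return "STANDARD"
    if expected_accesses_per_year >= 4:
        return "NEARLINE"
    if expected_accesses_per_year >= 1:
        return "COLDLINE"
    return "ARCHIVE"
```

Colder classes trade lower at-rest cost for higher retrieval cost, which is why the evaluation step weighs access patterns before committing data to long-term archiving.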

Execution

  • Subscription to Google Cloud (if not already subscribed) and creation of storage facilities
  • Configuration of backup batches according to data size, criticality, etc.
  • Data security
  • Restore tests (image mode, granular mode)
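A granular-mode restore test ultimately comes down to proving that restored files are byte-identical to their source. A simplified, hypothetical sketch that compares SHA-256 digests across the two trees (not the actual test procedure):

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_restore(source_dir: Path, restored_dir: Path) -> list[str]:
    """Compare every file in the source tree against the restored tree.

    Returns the relative paths that are missing from the restore or
    whose contents differ; an empty list means the test passed.
    """
    mismatches = []
    for path in sorted(source_dir.rglob("*")):
        if not path.is_file():
            continue
        rel = path.relative_to(source_dir)
        restored = restored_dir / rel
        if not restored.exists() or sha256_of(restored) != sha256_of(path):
            mismatches.append(str(rel))
    return mismatches
```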

Discover our use cases

Carrefour Belgium wanted to modernize its data chain and optimize the utilization of its data.

Our solution

Migration to Google Cloud Platform (GCP) via Cloudera enables secure and open storage and processing of data.

Result

Carrefour can leverage its data more efficiently and utilize Google services such as stock reporting, data science, Vertex AI, and BigQuery.

Conrad was seeking to standardize, manage, and share its data across its services and entities in a simple and efficient manner.

Our solution

An architecture of microservices packaged in containers and executed on Kubernetes enabled high-throughput processing. Once cleansed, the data is distributed, processed, and stored by other microservices.

Result

A Big Data platform that makes structured data from all connected sources available in real-time.

Support for ad hoc queries through BigQuery.
Elimination of data silos.
Self-service approach enabling simplified data analysis and dashboard creation.