How IBM Consulting's GICA Smart City Solution Enables Smarter Municipal Governance and Provides a Foundation for AI-Driven Urban Services

By Diego Colombatto

City leaders and municipal governments have an opportunity to enhance urban operations and citizen services through intelligent, data-driven governance. As municipalities evolve toward smarter, more sustainable urban environments, they can unlock significant operational efficiencies and cost savings by creating unified digital representations of their infrastructure and services.

IBM Consulting's GICA (Garnet IoT Context-Aware) solution built on AWS enables this transformation by connecting diverse urban data sources—from IoT sensors to legacy systems—into a comprehensive digital twin of the city. This solution empowers municipalities to optimize resource allocation, make real-time data-driven decisions, and deliver enhanced citizen services while maintaining vendor independence through open standards compliance. By using FIWARE (Future Internet-ware) and the Garnet Framework, cities can break down departmental silos and create the contextual intelligence necessary for true urban innovation.

In this blog, you will learn how GICA creates unified digital twins that integrate data from traffic sensors, waste management systems, structural health monitors, and other urban infrastructure. You will discover practical implementation strategies that have delivered 30-50% faster time-to-market and 25-40% lower development costs, as achieved in the IBM GICA implementation for the Madrid municipality. Additionally, you will explore real-world use cases demonstrating how cities use GICA to optimize everything from ornamental fountain operations to bridge maintenance scheduling. You will understand the technical architecture built on AWS services that ensures scalability, security, and seamless integration with existing municipal systems, along with actionable guidance for successful deployment in your city.

Enabling Urban Intelligence Through Open Standards

Cities can move from isolated services to unified, intelligent environments by connecting systems such as lighting, traffic, and waste collection. This integration enables holistic data management and cross-domain optimization. It provides rich, real-time context for AI-driven decisions, paving the way for autonomous “Agentic Cities.” Open standards ensure vendor independence and scalability, reducing costs and enabling rapid expansion. Strong security and privacy measures protect IoT networks and citizen data, building trust in digital services.

Open Standards Foundation: FIWARE and Garnet

FIWARE is an open-source framework born from public-private collaboration between the European Commission and industry partners, and it serves as a de facto standard for smart cities. It enables interoperability, reusability, and vendor independence through its commitment to open standards.

The European Union actively supports FIWARE adoption through multiple initiatives. The European Commission's Open-Source Strategy 2020-2023 specifically encourages "Think Open" approaches, with FIWARE serving as a key driver for leveling the playing field across Europe. More than 300 cities from over 30 countries worldwide use FIWARE technologies to create smarter, more sustainable urban environments.

Key FIWARE Components for Smart Cities

  • ETSI NGSI-LD API: A standard for sharing context information in a distributed environment, designed to improve interoperability between different smart city, IoT, and other applications by using linked data principles and a RESTful API. It standardizes how real-world entities, their properties, geolocation, and relationships are described and managed.
  • Context Broker: The central component that receives, manages, and distributes context data as it changes in real-time.
  • Smart Data Models: Pre-defined, interoperable data models that ensure consistency across different city services and domains.
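To make these components concrete, the following is a minimal Python sketch of how a single sensor reading could be expressed as an NGSI-LD entity and pushed to a context broker. The broker URL, entity identifier, and attribute values are illustrative placeholders, not part of any specific GICA deployment.

```python
import json
import urllib.request

# Illustrative broker endpoint; a real deployment would use the
# context broker URL exposed by the platform.
BROKER_URL = "http://localhost:9090/ngsi-ld/v1/entities"

# An NGSI-LD entity describing a hypothetical air-quality observation,
# loosely following the Smart Data Models AirQualityObserved convention.
entity = {
    "id": "urn:ngsi-ld:AirQualityObserved:madrid-001",
    "type": "AirQualityObserved",
    "dateObserved": {"type": "Property", "value": "2024-05-01T10:00:00Z"},
    "NO2": {"type": "Property", "value": 22, "unitCode": "GQ"},
    "location": {
        "type": "GeoProperty",
        "value": {"type": "Point", "coordinates": [-3.7038, 40.4168]},
    },
    "@context": [
        "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld"
    ],
}

# Create the entity in the broker via the standard NGSI-LD REST API.
request = urllib.request.Request(
    BROKER_URL,
    data=json.dumps(entity).encode("utf-8"),
    headers={"Content-Type": "application/ld+json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.status)  # 201 Created on success
```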

The Garnet Framework

The Garnet Framework powers living digital twins and context-aware solutions. It manages real-time context, adapting continuously to changes in the city. It transforms fragmented data into interconnected knowledge through dynamic graphs, enabling smarter recommendations. AWS Cloud Development Kit simplifies deployment on AWS infrastructure. Full compliance with ETSI NGSI-LD and FIWARE standards guarantees interoperability and openness.
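Since the framework is deployed with the AWS Cloud Development Kit, the snippet below gives a minimal illustration of the general CDK pattern in Python; it is not the Garnet stack itself, and the stack and resource names (a hypothetical versioned S3 bucket for city data) are invented for the example.

```python
from aws_cdk import App, Stack, aws_s3 as s3
from constructs import Construct


class SmartCityDataLakeStack(Stack):
    """Illustrative stack: a versioned S3 bucket for historical city data."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self,
            "CityDataLake",
            versioned=True,  # keep every revision of stored records
        )


app = App()
SmartCityDataLakeStack(app, "SmartCityDataLakeStack")
app.synth()  # emits the CloudFormation template; deploy with `cdk deploy`
```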

IBM GICA Solution Overview

IBM Consulting's GICA solution integrates the Garnet Framework with AWS cloud services to deliver context-aware smart city solutions that enable true urban intelligence. This comprehensive approach supports the evolution to "Agentic Cities" where AI systems can operate with contextual understanding while ensuring compliance with EU recommendations.

The IBM GICA solution creates a unified digital twin of the city by connecting diverse urban data sources through open standards. The solution establishes a semantic layer that enables different city services to understand and interact with each other, breaking down the traditional silos that limit municipal efficiency.

The following visualization (Figure 1) demonstrates how GICA creates a comprehensive urban digital twin that integrates diverse data sources into a unified, contextually aware city representation. It illustrates how real-time IoT sensor data from anemometers and other devices can be visualized within a 3D Tiles urban environment.

Figure 1: IBM GICA Ornamental Hydraulic Installations


Core Solution Components

  • Data Integration and Standardization: GICA transforms diverse data from IoT sensors, legacy systems, and municipal databases, ensuring consistent data formats across all city services.
  • Dynamic Context Management: The FIWARE Context Broker continuously manages context data and enables real-time responses through subscription-based notifications and automated alerts.
  • AI-Ready Foundation: Rich, contextual datasets provide the essential foundation for AI applications, machine learning modules, and predictive analytics that power intelligent urban services.
  • Citizen-Centric Services: Applications that put citizens at the center, from mobile apps providing real-time service information to digital engagement platforms that facilitate participatory governance.

Implementation Approach

IBM Consulting uses a phased approach. First, it assesses organizational and technical readiness, identifies priority use cases, and defines key performance indicators. Next, it deploys a pilot with a minimum viable use case, provides technical training, and validates results. Finally, it scales the solution by adding new use cases and refining features based on stakeholder feedback.

Use Cases and Applications

Current Implementations

Cities already use GICA to deliver smart services across multiple domains. They monitor ornamental fountains in real time, adjusting water flow automatically during adverse weather to improve safety and efficiency. They track structural health in bridges and overpasses, enabling proactive maintenance and reducing downtime. Street cleaning operations use GPS and sensors for route optimization and transparent reporting. Waste collection relies on IoT-enabled bin monitoring and dynamic scheduling to optimize routes. Traffic flow measurement uses inductive loop sensors to provide accurate, real-time data for planning and management.

Future Expansion Opportunities

GICA’s architecture supports growth into new domains. Smart lighting systems adjust dynamically based on pedestrian presence and environmental conditions. Municipal building management optimizes energy use and predicts maintenance needs. Emergency response coordination gains real-time situational awareness. Citizen engagement platforms enable digital participation and instant feedback. Environmental monitoring tracks air quality, noise pollution, and climate adaptation measures.

Technical Architecture: AWS-Powered Smart City Infrastructure

The GICA solution harnesses the breadth and depth of AWS services to deliver a robust, secure, and high-performance smart city platform, purpose-built to evolve in parallel with the growing demands of municipal operations and the rapid pace of technological innovation. By combining modular design principles with cloud-native scalability, the architecture ensures that cities can seamlessly integrate new capabilities, expand service coverage, and maintain operational excellence without compromising on governance, interoperability, or security. 

AWS Services Architecture

The following section provides a comprehensive view of the end-to-end data flow within the solution, illustrating how AWS services are orchestrated to create a modular, extensible, and standards-compliant smart city environment. This architecture enables the secure ingestion of heterogeneous data streams, their semantic transformation into NGSI-LD-compliant entities, and the contextualized publication of urban information to both municipal and enterprise platforms. The design ensures full system observability, traceability, and governance, while fostering an interoperable ecosystem that is ready to embrace the digital evolution of urban environments.

Amazon API Gateway 

At the core of the platform’s ingress layer, Amazon API Gateway provides two distinct and strategically segmented entry points, each tailored to specific security and traffic management requirements. The Public API is designed for external providers, secured through Amazon Cognito and JWT-based authentication, while the Private API is reserved for internal secure systems, accessible exclusively via a secure proxy layer within the organization’s trusted network perimeter.

By acting as the central control point for all inbound requests, API Gateway enforces authentication and fine-grained access control, applies throttling policies to protect against abuse, manages API versioning and routing, and ensures perimeter-level security with comprehensive request logging. Its native integration with AWS Lambda, Amazon SQS, and Amazon Cognito allows for seamless downstream processing, ensuring that every request entering the system is authenticated, filtered, and fully traceable from the very first hop. 
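As a rough sketch of the public entry point from a telemetry provider's perspective, the following Python example authenticates against a Cognito user pool and calls the public API with the resulting JWT. The client ID, API URL, credentials, and payload are placeholders, and the example assumes the USER_PASSWORD_AUTH flow is enabled for the app client.

```python
import json
import boto3
import urllib.request

# Placeholder identifiers; real values come from the platform's Cognito
# user pool and API Gateway stage.
CLIENT_ID = "example-cognito-app-client-id"
API_URL = "https://api.example-city.example/v1/telemetry"

# Authenticate against Cognito and obtain a JWT for the provider account.
cognito = boto3.client("cognito-idp")
auth = cognito.initiate_auth(
    ClientId=CLIENT_ID,
    AuthFlow="USER_PASSWORD_AUTH",
    AuthParameters={"USERNAME": "provider-01", "PASSWORD": "example-password"},
)
id_token = auth["AuthenticationResult"]["IdToken"]

# Send a telemetry reading through the public API, authenticated with the JWT.
payload = {"sensorId": "fountain-pressure-17", "value": 2.4, "unit": "bar"}
request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {id_token}",
        "Content-Type": "application/json",
    },
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.status)
```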

Amazon Cognito

Amazon Cognito is dedicated exclusively to managing authentication for external and public access, providing a secure and scalable identity layer without the operational overhead of managing IAM infrastructure. Through JWT-based authentication, automatic token expiration and renewal, and granular access control policies, Cognito enables the autonomous and secure onboarding of external telemetry providers, ensuring that only trusted and verified data sources can interact with the platform.

AWS Lambda (Input validation): Once a request passes through API Gateway, it is processed by the first AWS Lambda function, which serves as a critical validation and filtering stage. This function verifies authentication and role-based permissions (for public entry points), validates the payload’s structural integrity and semantic compliance (including schema adherence, timestamp accuracy, and entity definitions), and rejects any corrupted or incomplete messages with a controlled and meaningful error response. Valid messages are routed to Amazon SQS for asynchronous processing, while all interactions—whether successful or erroneous—are recorded as audit events (REQUEST, REPLY, ERROR, INFO), ensuring complete traceability and operational transparency. 
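A simplified sketch of what such a validation function might look like is shown below; the required fields, queue URL, and audit handling are assumptions for illustration, not the actual GICA schema.

```python
import json
import os
import boto3

sqs = boto3.client("sqs")
# Queue URL supplied via configuration; the default below is a placeholder.
QUEUE_URL = os.environ.get(
    "QUEUE_URL",
    "https://sqs.eu-west-1.amazonaws.com/123456789012/example-ingest-queue",
)
REQUIRED_FIELDS = {"id", "type", "observedAt"}  # assumed minimal schema


def handler(event, context):
    """Validate an inbound telemetry message and enqueue it for processing."""
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return _reply(400, "ERROR", "Malformed JSON payload")

    missing = REQUIRED_FIELDS - body.keys()
    if missing:
        return _reply(400, "ERROR", f"Missing fields: {sorted(missing)}")

    # Valid message: hand off to SQS for asynchronous processing.
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(body))
    return _reply(202, "REQUEST", "Message accepted")


def _reply(status, audit_type, message):
    # Audit events (REQUEST / REPLY / ERROR / INFO) would be published
    # separately; here the classification is simply echoed in the response.
    return {
        "statusCode": status,
        "body": json.dumps({"audit": audit_type, "message": message}),
    }
```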

AWS Lambda (Publish): Upon successful dequeuing from SQS, a second AWS Lambda function performs advanced validation of attributes, relationships, and metadata, enforces context consistency, applies duplicate detection and temporal logic controls, and optimizes write operations to prevent overloading the downstream broker. The validated and enriched data is then published to the Scorpio Context Broker for semantic management. 
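The following is a minimal sketch of this second stage: a Lambda function triggered by SQS that batches the validated entities and upserts them into the Scorpio Context Broker through the standard NGSI-LD batch endpoint. The broker URL is a placeholder, and each entity is assumed to already carry its own @context.

```python
import json
import urllib.request

# Assumed internal endpoint of the Scorpio Context Broker.
BROKER_UPSERT_URL = "http://scorpio.internal:9090/ngsi-ld/v1/entityOperations/upsert"


def handler(event, context):
    """Take validated messages from SQS and upsert them into the Context
    Broker as a single NGSI-LD batch operation."""
    entities = [json.loads(record["body"]) for record in event["Records"]]
    if not entities:
        return {"published": 0}

    request = urllib.request.Request(
        BROKER_UPSERT_URL,
        data=json.dumps(entities).encode("utf-8"),
        headers={"Content-Type": "application/ld+json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        # 201/204 indicate the batch was created or updated.
        return {"published": len(entities), "status": response.status}
```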

Amazon Simple Queue Service (Amazon SQS): Amazon SQS functions as a decoupled buffering layer that absorbs high-traffic bursts, guarantees message durability, and provides automatic retry capabilities, enabling scalable and distributed consumption of messages. This design ensures that the system remains stable and responsive even when downstream services experience latency or temporary unavailability, effectively decoupling ingestion from processing and safeguarding against data loss. 

Amazon SQS Dead-Letter Queue (DLQ): To handle persistent processing failures, the architecture incorporates an SQS dead-letter queue, which isolates problematic messages after a defined number of retry attempts (up to 10) and routes them for manual review and resolution. This mechanism ensures that transient errors are retried automatically, while persistent issues are quarantined, preventing silent data loss and maintaining the integrity of the overall data flow. 
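A minimal sketch of how such a queue pair could be provisioned with boto3 is shown below, using the 10-retry limit described above; queue names are illustrative.

```python
import json
import boto3

sqs = boto3.client("sqs")

# Dead-letter queue that receives messages failing repeated processing.
dlq = sqs.create_queue(QueueName="gica-ingest-dlq")
dlq_arn = sqs.get_queue_attributes(
    QueueUrl=dlq["QueueUrl"], AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Main ingestion queue: after 10 failed receives a message moves to the DLQ,
# matching the retry limit described above.
sqs.create_queue(
    QueueName="gica-ingest-queue",
    Attributes={
        "RedrivePolicy": json.dumps(
            {"deadLetterTargetArn": dlq_arn, "maxReceiveCount": "10"}
        )
    },
)
```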

Scorpio Context Broker (NGSI-LD): Serving as the semantic core of the solution, the Scorpio Context Broker maintains the real-time state of urban assets, manages their attributes, relationships, and temporal context, and ensures full compliance with the NGSI-LD standard. This enables FIWARE-compliant interoperability, allowing municipal systems to exchange and consume contextual data seamlessly, thereby supporting informed decision-making and coordinated urban operations. 
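As an illustration of how a municipal application might consume this real-time context, the sketch below queries the broker for entities of a given type; the entity type, attribute names, and broker URL are assumptions for the example.

```python
import json
import urllib.parse
import urllib.request

BROKER_URL = "http://scorpio.internal:9090/ngsi-ld/v1/entities"

# Ask the broker for the current state of all bridge entities
# (entity type and attribute names are illustrative).
params = urllib.parse.urlencode({"type": "Bridge", "attrs": "vibration,strain"})
with urllib.request.urlopen(f"{BROKER_URL}?{params}") as response:
    bridges = json.loads(response.read())

for bridge in bridges:
    print(bridge["id"], bridge.get("vibration", {}).get("value"))
```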

AWS IoT Core serves as the central orchestration layer for all audit and telemetry events. Every component in the architecture that generates an audit record publishes it to IoT Core, which, through its powerful rules engine, inspects the payloads, evaluates metadata, and applies conditional logic to determine the appropriate routing path. This centralized orchestration allows the platform to handle diverse event types, from operational logs to NGSI-LD context changes, without embedding complex routing logic in each producer service. By consolidating event management in IoT Core, the architecture achieves consistent routing, simplified maintenance, and the flexibility to extend the pipeline with new destinations or processing rules as the platform evolves. 
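The sketch below shows, in simplified form, how such a routing rule could be expressed with the IoT Core rules engine via boto3; the topic, SQL filter, delivery stream, and IAM role are illustrative placeholders.

```python
import boto3

iot = boto3.client("iot")

# Route ERROR audit events published on the audit topic to a Firehose
# delivery stream; topic, stream, and role names are illustrative.
iot.create_topic_rule(
    ruleName="RouteErrorAuditEvents",
    topicRulePayload={
        "sql": "SELECT * FROM 'gica/audit' WHERE eventType = 'ERROR'",
        "actions": [
            {
                "firehose": {
                    "roleArn": "arn:aws:iam::123456789012:role/example-iot-firehose-role",
                    "deliveryStreamName": "gica-audit-stream",
                    "separator": "\n",
                }
            }
        ],
        "ruleDisabled": False,
    },
)
```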

Amazon Data Firehose acts as the high-throughput delivery channel that transports events from IoT Core to the storage and analytics layers. It buffers and batches incoming data to optimize delivery performance, automatically scaling to handle variable loads without data loss. Firehose can also perform inline transformations such as format conversion, compression, and encryption, ensuring that data is delivered in a storage-efficient and query-ready format. This capability is essential for smoothing ingestion spikes, reducing downstream processing costs, and improving the performance of analytical queries executed against the Data Lake. 

AWS Lambda functions within this pipeline are responsible for enriching and formatting the records before they are stored. As events pass through, Lambda adds contextual metadata such as source system identifiers, processing stage information, correlation IDs, and precise timestamps. It also normalizes the data to align with the platform’s historical data model and applies semantic transformations to preserve NGSI-LD context integrity. This enrichment process ensures that every stored record is self-contained, consistent, and ready for immediate use by downstream analytics, compliance checks, or forensic investigations, without requiring additional preprocessing.
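A minimal sketch of a Firehose transformation Lambda of this kind is shown below; the enrichment fields are examples, not the platform's actual historical data model.

```python
import base64
import json
from datetime import datetime, timezone


def handler(event, context):
    """Decode each Firehose record, attach contextual metadata, and return
    it re-encoded for delivery to the Data Lake."""
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))

        # Enrich with metadata of the kind described above; field names
        # are illustrative only.
        payload["sourceSystem"] = "gica-audit-pipeline"
        payload["processedAt"] = datetime.now(timezone.utc).isoformat()

        output.append(
            {
                "recordId": record["recordId"],
                "result": "Ok",
                "data": base64.b64encode(
                    (json.dumps(payload) + "\n").encode("utf-8")
                ).decode("utf-8"),
            }
        )
    return {"records": output}
```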

Amazon Simple Storage Service (Amazon S3) serves as the immutable, long-term repository for all enriched audit and historical data. Leveraging features such as versioning for change tracking, and cross-region replication for disaster recovery, S3 guarantees the durability, integrity, and accessibility of stored records. The Data Lake is organized in a time-series partitioned structure, enabling efficient querying and retrieval for a wide range of use cases, including regulatory compliance reporting, operational forensics, historical trend analysis, and the training of machine learning models with high-quality, labeled datasets. 

By integrating AWS IoT Core, Amazon Data Firehose, AWS Lambda, and Amazon S3 into a cohesive audit and logging pipeline, the platform ensures that every event is captured with full fidelity, enriched with the necessary context, and stored in a secure, immutable format. This not only supports day-to-day operational monitoring but also empowers municipalities to meet stringent compliance requirements, perform deep retrospective analyses, and continuously improve their smart city services based on reliable historical evidence. 

End-to-End Audit & Data Lake Logging

The architecture incorporates a comprehensive, multi-stage audit and logging pipeline designed to capture, process, and store every significant event generated across the platform’s lifecycle. All events are classified into four categories—REQUEST, REPLY, INFO, and ERROR—and are processed through a dedicated flow that ensures complete traceability, compliance, and operational transparency.

NGSI-LD Subscriptions & Derived Flows

The architecture implements a robust NGSI-LD subscription mechanism that enables the platform to react to changes in the urban context in real time, preserve a complete historical record of all contextual data, and seamlessly integrate with external analytics ecosystems. This capability ensures that the smart city environment remains synchronized, auditable, and analytically rich, supporting both operational responsiveness and long-term strategic planning. 

Internal Notifications are triggered whenever the NGSI-LD Context Broker detects a change in the state of an entity, such as the arrival of new telemetry, an update to an asset’s status, or the generation of an alert. These events are securely routed back into the platform through Amazon API Gateway, which enforces authentication and access control before passing them to AWS Lambda. The Lambda function parses the event, enriches it with any necessary metadata, and creates a corresponding Notification entity within the Context Broker. This closed-loop process ensures that the digital representation of the city remains continuously synchronized with real-world changes, enabling municipal systems to maintain accurate situational awareness and respond promptly to evolving conditions.
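To illustrate the mechanism, the sketch below registers an NGSI-LD subscription with the Context Broker whose notification endpoint is the platform's API Gateway URL; the entity type, watched attribute, and endpoint are placeholders.

```python
import json
import urllib.request

BROKER_SUBSCRIPTIONS_URL = "http://scorpio.internal:9090/ngsi-ld/v1/subscriptions"

# Subscribe to changes on fountain entities; when windSpeed is updated the
# broker notifies the platform's API Gateway endpoint (URL is a placeholder).
subscription = {
    "id": "urn:ngsi-ld:Subscription:fountain-wind-alerts",
    "type": "Subscription",
    "entities": [{"type": "Fountain"}],
    "watchedAttributes": ["windSpeed"],
    "notification": {
        "endpoint": {
            "uri": "https://api.example-city.example/v1/notifications",
            "accept": "application/json",
        }
    },
    "@context": [
        "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld"
    ],
}

request = urllib.request.Request(
    BROKER_SUBSCRIPTIONS_URL,
    data=json.dumps(subscription).encode("utf-8"),
    headers={"Content-Type": "application/ld+json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print(response.status)  # 201 Created on success
```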

Full Historical Capture in Amazon S3 is achieved by routing NGSI-LD event updates from the Context Broker into AWS IoT Core, which orchestrates their delivery to Amazon Data Firehose. Firehose batches and streams the events to AWS Lambda, where they are formatted according to the platform’s historical data standards and enriched with contextual metadata. The processed records are then stored in Amazon S3 as part of an immutable audit and time-series Data Lake. This historical repository supports compliance and governance requirements, enables time-series reconstruction for forensic analysis, and provides a rich dataset for business intelligence workloads, allowing city planners and analysts to identify trends, measure performance, and optimize urban operations over time.

Export to External Data Lake functionality allows the platform to share NGSI-LD change events with external analytics and machine learning environments in near real time. Events emitted by the Context Broker are ingested by AWS IoT Core, which bridges them to Amazon Kinesis for persistent, high-throughput streaming. AWS Lambda processes the streamed data, applying any necessary transformations or batching, before delivering it to the target external Data Lake. This integration enables municipalities and their partners to leverage advanced analytics, predictive modeling, and AI-driven insights without impacting the performance or security of the core smart city platform, fostering an open and collaborative data ecosystem.
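A simplified sketch of the last hop of this flow is shown below: a Lambda function consuming NGSI-LD change events from the Kinesis stream and writing them to an external Data Lake bucket. The bucket name and object layout are assumptions for illustration.

```python
import base64
import json
import boto3

s3 = boto3.client("s3")
# Destination bucket of the external Data Lake; the name is a placeholder.
EXTERNAL_BUCKET = "example-external-datalake"


def handler(event, context):
    """Consume NGSI-LD change events streamed through Kinesis and deliver
    them, lightly batched, to the external Data Lake bucket."""
    events = [
        json.loads(base64.b64decode(record["kinesis"]["data"]))
        for record in event["Records"]
    ]
    if events:
        # One object per invocation batch, keyed by the first sequence number.
        key = f"ngsi-ld-changes/{event['Records'][0]['kinesis']['sequenceNumber']}.json"
        s3.put_object(
            Bucket=EXTERNAL_BUCKET,
            Key=key,
            Body="\n".join(json.dumps(e) for e in events).encode("utf-8"),
        )
    return {"delivered": len(events)}
```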

Figure 2: IBM GICA solution architecture integrating FIWARE Context Broker with AWS cloud services for smart city data processing


High-Level Architecture Layers

The high-level architecture of the IBM GICA solution on AWS is organized into four logical layers, each leveraging AWS managed services to deliver secure, scalable, and intelligent urban operations.

Figure 3: High-level AWS architecture for GICA solution showing data ingestion, context management, analytics, and application layers


  • Data Ingestion Layer: This layer connects IoT devices, urban sensors, and external systems to the platform. Amazon API Gateway serves as the secure entry point, authenticating external providers via Amazon Cognito and routing validated requests to Amazon SQS for asynchronous processing. High-throughput telemetry streams are ingested through Amazon Kinesis and AWS IoT Core, with Amazon Data Firehose providing buffering, transformation, and delivery to downstream analytics and storage services. This layer ensures that all incoming data is securely ingested, validated, and made available for processing in real time. 
  • Analytics & Intelligence Layer: Once ingested, data flows into the analytics and intelligence layer, where it is processed, analyzed, and enriched to generate actionable insights. AWS Lambda executes event-driven processing logic, while Amazon EventBridge orchestrates event routing between services and triggers workflows based on contextual changes. Amazon Athena enables ad-hoc querying of historical data stored in Amazon S3 without the need for ETL pipelines, supporting rapid investigation and reporting (see the query sketch after this list). Amazon SageMaker provides a managed environment for building, training, and deploying machine learning models that can predict asset failures, detect anomalies in sensor data, and optimize resource allocation. These insights are fed back into the operational context, enabling proactive decision-making and automation. 
  • Context Management Layer: The context management layer maintains the real-time and historical state of all urban assets, ensuring that operational decisions are grounded in accurate, up-to-date, and semantically rich information. Amazon S3 serves as the immutable Data Lake for storing enriched historical records, audit logs, and time-series data, providing durability, compliance, and analytical readiness. The NGSI-LD Context Broker (Scorpio) manages the semantic model of the city, maintaining relationships, attributes, and temporal context for each entity, and enabling FIWARE-compliant interoperability across municipal systems. Within this layer, IBM Maximo operates as the authoritative master for corporate asset information, holding the definitive inventory of assets, their attributes, and lifecycle status.
  • Application Layer: The application layer exposes processed and contextualized information to end users, administrators, and external systems. Amazon API Gateway provides secure, authenticated access to APIs that deliver insights, dashboards, and operational controls.
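To complement the Analytics & Intelligence layer described above, the following is a minimal sketch of an ad-hoc Athena query over the time-series partitioned Data Lake; the database, table, partition keys, and output location are invented for the example.

```python
import boto3

athena = boto3.client("athena")

# Ad-hoc query over the time-series partitioned Data Lake in S3;
# database, table, and bucket names are illustrative.
query = """
    SELECT entity_id, AVG(value) AS avg_value
    FROM gica_history.sensor_readings
    WHERE year = '2024' AND month = '05'
    GROUP BY entity_id
"""
execution = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "gica_history"},
    ResultConfiguration={"OutputLocation": "s3://example-gica-athena-results/"},
)
print(execution["QueryExecutionId"])  # poll this ID for results
```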

Security and Compliance

IBM GICA implements a defense-in-depth security model, fully aligned with AWS best practices, but tailored to the operational and compliance needs of enterprise integration workloads. This ensures that sensitive data processed through IBM GICA is protected end-to-end.

Security Framework

Network Security:

  •  IBM GICA is deployed within a dedicated Amazon Virtual Private Cloud (VPC), ensuring complete network isolation for integration flows and API endpoints. 
  • Custom security groups and network ACLs are configured to restrict inbound and outbound traffic to only approved integration partners and services. 
  • AWS PrivateLink is leveraged to connect IBM GICA components to AWS services (e.g., Amazon S3, Amazon Relational Database Service (RDS)) without exposing traffic to the public internet, reducing attack surface.

Data Protection:

  • All encryption keys for IBM GICA data flows are managed via AWS Key Management Service (AWS KMS).
  •  Integration payloads are encrypted in transit using TLS 1.2+, ensuring secure communication between APIs, connectors, and backend systems. 
  • Persistent data (e.g., configuration, logs, and integration state) is encrypted at rest in Amazon S3 and Amazon RDS, with access restricted to IBM GICA service roles.

Identity and Access Management:

  • AWS Identity and Access Management (IAM) is integrated with IBM GICA’s role-based access control (RBAC) to enforce least privilege across administrative, operational, and developer roles. 
  • Service accounts and API keys are scoped to specific integration tasks, preventing unauthorized cross-environment access. 
  • AWS CloudTrail is configured to log all API calls made by IBM GICA components, enabling full traceability for audits and incident investigations.

Compliance Support:

  •   IBM GICA’s deployment architecture supports GDPR by ensuring data residency in designated AWS regions and by implementing data minimization in integration flows. 

The solution inherits AWS’s SOC 2 and ISO 27001 certifications, while adding IBM’s own compliance controls for integration governance. 

Configurable data residency policies allow IBM GICA to meet local regulatory requirements, ensuring that integration data does not leave approved jurisdictions.

For complete security guidance, see the AWS Security, Identity, and Compliance architecture resources.

Benefits: Measurable Impact for Cities and Citizens

Operational Excellence

Based on previous implementations, such as the one delivered for the Madrid municipality, cities implementing IBM GICA can achieve:

  • 30-50% Reduction in Time-to-Market: Automated AWS CDK deployment and pre-built, standardized components accelerate new use case implementation.
  • 25-40% Lower Development Costs: Reuse of open standards and validated modules reduces expenses compared to custom solutions.
  • Enhanced Scalability: Interoperable architecture reduces incremental costs when adding new domains or use cases.

Enhanced Governance and Service Delivery

  • Real-time Insights: City administrators gain actionable insights from contextualized data for evidence-based policy making.
  • Automated Response Systems: Millisecond response times for critical events improve continuity, reliability, and public safety.
  • Improved Service Delivery: Citizens experience faster response times, more reliable operations, and transparent service tracking.

Sustainability and Innovation

  • Resource Optimization: Intelligent systems optimize water, energy, and waste management, reducing environmental impact.
  • Support for EU Climate Mission: GICA solution aligns with the EU Mission for 100 climate-neutral and smart cities by 2030.
  • Innovation Acceleration: Open APIs and standards enable rapid development of new applications and services by third parties.
  • Economic Growth: Enhanced city services and digital infrastructure attract businesses and support job creation.

Summary

In this blog, you learned how municipalities can transform fragmented urban operations into unified, intelligent environments by creating digital twins that connect previously siloed systems—from traffic sensors to waste management to structural monitoring. You discovered how IBM GICA leverages open standards like FIWARE and ETSI NGSI-LD to enable cities to integrate diverse data sources while maintaining vendor independence and supporting future AI-driven decision-making.

Through practical examples ranging from automated fountain management during adverse weather to proactive bridge maintenance based on real-time structural data, you explored how contextual intelligence enables cities to optimize resource allocation, improve citizen services, and make evidence-based operational decisions. You also examined the phased implementation approach that allows municipalities to start with a single high-value use case and systematically expand across domains.

Begin your smart city transformation by identifying one municipal service that would benefit from real-time data integration and contextual awareness. Evaluate your existing IoT infrastructure and assess which use case—whether traffic management, waste collection, or infrastructure monitoring—offers the highest immediate value for a pilot implementation using the assessment framework outlined in this blog.

Authors

Luis Miguel Díaz: IBM Public Sector Associate Partner. Specialist in the Public Sector with more than 17 years of experience in senior consulting, program and project leadership, specialized in technological implementations and digital transformation. He is currently responsible for Public Sector accounts. He has led the IoT (Internet of Things) Practice at IBM Consulting Spain, covering asset optimization, smart cities, Edge Computing and building optimization. With a degree in Telecommunications Engineering from ETSI Valladolid, he has directed strategic projects in Spanish cities, implementing management systems for public services. His experience also spans sectors such as transportation, energy, retail and public administrations, with other specializations in SAP, AWS, Azure and change management.

Alfonso Peñaranda: IBM Public Sector Account Leader with more than 25 years of experience in senior consulting, and program and project leadership. Holding a Master's degree in Telecommunication Engineering from ETSIT UPM (Madrid), he has led transformation programs for public sector, transport, and telecommunications clients in Spain, ranging from inventory and asset management to provisioning transformation, smart cities, and cloud migration.

Gustavo Nogales: IBM Technical Architect. Professional with over 19 years of experience in consulting, leading integration and digital transformation projects in complex environments. Specialized in designing and deploying innovative solutions for the Public Sector.  As head of the innovation team, he has led initiatives in IoT, API management, smart cities, asset optimization, and real-time data processing, building secure, scalable, and interoperability-focused architectures.  Holding a degree in Computer Engineering from the University of Alcalá, he has directed strategic projects in Spanish municipalities, integrating public service management systems and deploying advanced sensor networks. His experience also spans sectors such as telecommunications, textile, financial and payment services, and public administration, with additional expertise in system integration, cloud migration, and security architecture.

Diego Colombatto: As AWS Principal Partner Solution Architect he helps customers and partners translate business needs into solutions using AWS technology. His expertise domains are AI, Analytics and Marketplaces, where he enjoys understanding and demystifying latest patterns and technologies, to evaluate and propose them for customers’ solutions. In his spare time, you can find him happily looking after his kids, cooking, trading and trekking on beautiful Italian Alps.

José Ángel Bermúdez Cortés: IBM Alliances & Business Development Director. Business and technology leader with more than 20 years of experience in consulting, business development, and digital transformation across the Travel & Hospitality, Energy, and Financial Services sectors. He currently leads cloud and alliance initiatives at IBM for Spain, Portugal, and Greece, where he established and scaled the IBM–AWS Strategic Alliance, positioning it among the top-performing partnerships in EMEA. Previously, at Indra, he directed the Digital Airline Practice and managed key accounts such as International Airlines Group (IAG), achieving significant business growth through digital innovation programs. With a background in Computer Science Engineering and a Master’s Degree in Cybersecurity, he holds certifications in PMI-PgMP, PMP, and AWS Solutions Architect, and has completed executive programs at IE Business School, Stanford, and Berkeley. His expertise includes cloud strategy, AI adoption, customer experience transformation, and strategic alliance management.
