The Role
The Integration Architect is responsible for designing, governing, and delivering enterprise-scale integration solutions across distributed systems. This role requires deep expertise in event-driven architecture (EDA), real-time streaming, and cloud-native integration patterns built on Kafka and AWS messaging/streaming services such as EventBridge, SQS, SNS, and Kinesis.
The Integration Architect partners with engineering, product, and cloud teams to create scalable, secure, and resilient integration landscapes.
Your responsibilities:
- Define enterprise integration architecture using event-driven, microservices, and real-time streaming patterns.
- Architect solutions using Kafka, AWS EventBridge, SQS/SNS, Kinesis Streams/Firehose, and Kafka Connect.
- Establish integration standards, best practices, reusable frameworks, and governance models.
- Design solutions that ensure high availability, scalability, observability, and security.
- Evaluate system integration options and recommend optimal patterns (Pub/Sub, CQRS, event sourcing, streaming analytics, request-response APIs, batch).
- Lead the end-to-end delivery of integration platforms and streaming pipelines.
- Define event schemas, streaming topologies, routing logic, partitions, consumer groups, and throughput targets.
- Guide development teams in building producers, consumers, connectors, and stream processing applications.
- Review designs/code to ensure alignment with architecture guidelines.
- Architect integration workloads using:
  - Amazon Kinesis Data Streams & Data Firehose
  - Amazon EventBridge event buses
  - SQS/SNS for messaging patterns
  - Kafka clusters (Confluent, MSK, or open-source)
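As an illustrative sketch of the routing logic behind services like EventBridge, the exact-match core of event-pattern matching can be modeled in a few lines: a rule matches when every field in the pattern exists in the event and the event's value appears in the pattern's list of allowed values. The `matches` helper and the sample payloads are hypothetical, and this covers only exact-value lists, not EventBridge's full content-filtering syntax.

```python
def matches(event: dict, pattern: dict) -> bool:
    # Simplified EventBridge-style matching: every pattern key must be
    # present in the event; leaf pattern values are lists of allowed
    # values, and nested dicts recurse.
    for key, allowed in pattern.items():
        if key not in event:
            return False
        if isinstance(allowed, dict):
            if not isinstance(event[key], dict) or not matches(event[key], allowed):
                return False
        elif event[key] not in allowed:
            return False
    return True

event = {"source": "orders.service", "detail": {"status": "CREATED"}}
rule = {"source": ["orders.service"], "detail": {"status": ["CREATED", "UPDATED"]}}
print(matches(event, rule))  # True
```

Reasoning about rules in this reduced form helps when deciding whether routing belongs in an event bus rule or in consumer-side filtering.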
- Work closely with cloud engineering teams on infrastructure design, IaC (Terraform/CloudFormation), performance tuning, and cost optimization.
- Implement monitoring using CloudWatch, Grafana, Prometheus, or OpenTelemetry.
- Enforce integration security practices:
  - Authentication/authorization
  - IAM policies
  - Encryption at rest and in transit
  - Data governance & lineage
- Ensure solutions meet RPO, RTO, resiliency, disaster recovery, and failover requirements.
- Establish observability using tracing, logging, alerting, and dashboards.
- Collaborate with product owners, domain architects, delivery managers, and business stakeholders.
- Translate business requirements into scalable integration architectures.
- Provide technical leadership across teams and mentor integration engineers.
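One recurring sizing exercise behind the responsibilities above is translating a throughput target into a partition count. A rough back-of-the-envelope helper follows; the per-partition throughput and headroom figures are assumptions to be benchmarked per cluster, not Kafka constants.

```python
import math

def partitions_for_throughput(target_mb_s: float,
                              per_partition_mb_s: float,
                              headroom: float = 1.5) -> int:
    # Scale the target by a headroom factor (traffic spikes, rebalances),
    # then divide by the measured per-partition throughput and round up.
    return math.ceil(target_mb_s * headroom / per_partition_mb_s)

# e.g. a 120 MB/s target with ~10 MB/s per partition and 1.5x headroom:
print(partitions_for_throughput(120, 10))  # 18
```

Estimates like this feed directly into the partition, consumer-group, and throughput-target definitions mentioned earlier, and get revisited once real load tests exist.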
Essential skills/knowledge/experience:
- Strong hands-on experience with Kafka:
  - Topics, partitions, consumer groups
  - Kafka Streams, ksqlDB
  - Schema Registry & Avro/JSON/Protobuf
  - Kafka Connect connectors
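To make the topics/partitions/consumer-groups bullet concrete, here is a simplified sketch of how a range-style assignor spreads one topic's partitions across a consumer group. Kafka's real RangeAssignor works per topic and handles rebalances and subscriptions; this is illustrative only.

```python
def assign_partitions(num_partitions: int, consumers: list[str]) -> dict[str, list[int]]:
    # Range-style assignment, simplified: sort consumers by id, split the
    # partitions into contiguous ranges, and give the leftover partitions
    # to the first consumers in sorted order.
    members = sorted(consumers)
    per, extra = divmod(num_partitions, len(members))
    assignment, start = {}, 0
    for i, member in enumerate(members):
        count = per + (1 if i < extra else 0)
        assignment[member] = list(range(start, start + count))
        start += count
    return assignment

print(assign_partitions(6, ["c-1", "c-2", "c-3", "c-4"]))
# {'c-1': [0, 1], 'c-2': [2, 3], 'c-3': [4], 'c-4': [5]}
```

Seeing the assignment laid out this way makes it obvious why having more consumers than partitions leaves some consumers idle.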
- Deep expertise in AWS streaming & messaging:
  - Amazon Kinesis Data Streams and Firehose
  - EventBridge (rules, event buses, routing)
  - SQS/SNS with dead-letter queues
  - AWS Lambda event-based integrations
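The dead-letter-queue bullet can be illustrated by the redrive rule SQS applies: once a message has been received more than the policy's maxReceiveCount times without being deleted, SQS moves it to the configured DLQ. A minimal pure-Python model of that decision (no real SQS calls; names are illustrative):

```python
def route_after_receive(receive_count: int, max_receive_count: int = 3) -> str:
    # SQS redrive policy, simplified: a message received more than
    # maxReceiveCount times without a successful delete is dead-lettered.
    return "dlq" if receive_count > max_receive_count else "source-queue"

# A message whose consumer keeps failing is retried, then dead-lettered.
for attempt in range(1, 6):
    print(attempt, route_after_receive(attempt, max_receive_count=3))
```

Keeping this rule in mind shapes DLQ alarm thresholds and how aggressively consumers should retry before giving up.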
- Experience designing event-driven and microservices architectures.
- Strong knowledge of:
  - API integration patterns (REST, GraphQL)
  - ETL/ELT and data pipelines
  - Distributed system design
  - High-throughput, low-latency data streaming
- Working proficiency in Java, Python, or Node.js for implementing integration logic.
- Familiarity with containerization & orchestration (Docker, Kubernetes).
- Working knowledge of CI/CD, DevOps, IaC.
- Architectural thinking with strong problem-solving abilities.
- Ability to lead teams, mentor developers, and influence decisions.
- Strong communication and stakeholder engagement skills.
- Experience working in Agile environments.