Data Engineer – Outside IR35
Venesky-Brown’s client, a public sector organisation in Edinburgh, is currently looking to recruit a Data Engineer for an initial 6-month contract, with the potential to extend, at a rate of £484/day (Outside IR35). This role will be a hybrid of working at home and in the office.
Responsibilities:
– Ensure technical resiliency of all data integration solutions and services
– Support the delivery of ongoing data engineering activities
– Enhance and support existing data product outputs for both internal and external customers.
– Collaborate with technical colleagues across the organisation to design robust data integration solutions
– Demonstrate excellent, sustainable and collaborative software development practice, focused on delivering highly readable, maintainable and appropriate artefacts.
– Actively participate in all team events, leading where specialist knowledge is required and supporting the team to improve their process through inspection and adaptation.
– Troubleshoot and fix development and production problems across multiple environments and operating platforms.
– Engage with the wider communities of practice and interest to share knowledge, techniques and experience
Requirements for Data Engineering services:
– Ensure the high quality of developed solutions through the development and maintenance of unit tests, with appropriate code coverage, and code analysis using code quality tools (a minimal sketch follows this list).
– Ensure that developed software complies with non-functional software requirements such as accessibility, security, UI/UX, performance, maintainability and deployability.
– Routinely use collaborative development practices such as pairing and mobbing techniques in programming, code reviews, system design and requirements analysis, etc.
– Support and deliver the disaster recovery assurance of digital services, striving towards a sustainable Recovery Time Objective of 2 hours and a Recovery Point Objective of zero. This will be assured at six weekend test points over the course of a financial year.
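By way of illustration, here is a minimal sketch of the unit-testing practice described above, in Python with pytest; the function, its behaviour and the test data are hypothetical and not taken from the organisation's codebase. Code coverage would typically be measured alongside this with a tool such as coverage.py.

    # Hypothetical transformation function and its tests, for illustration only.
    import pytest

    def normalise_postcode(raw: str) -> str:
        """Collapse internal whitespace and upper-case a UK postcode string."""
        return " ".join(raw.split()).upper()

    @pytest.mark.parametrize("raw, expected", [
        ("eh1 1aa", "EH1 1AA"),      # lower case is normalised
        ("  EH8  9YL ", "EH8 9YL"),  # stray whitespace is collapsed
    ])
    def test_normalise_postcode(raw, expected):
        assert normalise_postcode(raw) == expected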
Essential Skills:
– Python
– PostgreSQL
– REST APIs
– Modern DevOps and CI/CD practices and tooling including Docker, GitLab CI, AWS CodePipeline, AWS CDK and AWS CloudFormation
– Expertise in SQL, data transformation and analysis
– Delivering high-quality software collaboratively in high-performing, cross-functional development teams
– Experience implementing ETL pipelines, data streaming systems and data integration solutions (see the sketch after this list)
– Experience working in Agile delivery models such as Scrum and/or Kanban
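As a rough indication of the ETL work listed above, here is a minimal Python sketch that extracts JSON records from a REST API and loads them into PostgreSQL via psycopg2; the endpoint, connection string, table and field names are all assumptions made for the example.

    # Illustrative extract-and-load step; all names here are hypothetical.
    import requests
    import psycopg2

    def load_readings(api_url: str, dsn: str) -> int:
        """Fetch JSON records from a REST API and insert them into PostgreSQL."""
        response = requests.get(api_url, timeout=30)
        response.raise_for_status()
        rows = [(r["sensor_id"], r["value"]) for r in response.json()]

        # The connection context manager commits the transaction on success.
        with psycopg2.connect(dsn) as conn:
            with conn.cursor() as cur:
                cur.executemany(
                    "INSERT INTO readings (sensor_id, value) VALUES (%s, %s)",
                    rows,
                )
        return len(rows)

A production pipeline would add retries, batching and monitoring around this; the sketch shows only the shape of the extract-and-load step.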
Desirable Skills:
– Data warehousing
– Hybrid on-premises/cloud solutions
– AWS Glue, Step Functions, Lambda functions, S3, RDS, Data Migration Service
– Using testing tools for unit testing, including system test automation frameworks
– OpenShift
– PostGIS for PostgreSQL
– Designing and implementing solutions using service and event-based architectures (see the sketch after this list)
– Monitoring, alerting, intelligence tools and processes, including Grafana
– Human-centred, research-driven, inclusive design practices
– TDD
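To illustrate the event-based architectures mentioned above, here is a minimal AWS Lambda handler in Python that reacts to an S3 object-created event notification; the bucket and any downstream processing are hypothetical.

    # Sketch of an event-driven step, triggered by an S3 event notification.
    import json
    import urllib.parse

    def handler(event, context):
        """Record each newly created S3 object so a downstream step can consume it."""
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            # S3 event keys are URL-encoded, e.g. '+' in place of spaces.
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            print(json.dumps({"bucket": bucket, "key": key}))  # visible in CloudWatch Logs
        return {"statusCode": 200}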
If you would like to hear more about this opportunity, please get in touch.