Azure Data Engineer (Databricks)

Capco
London
9 months ago
Applications closed


Joining Capco means joining an organisation that is committed to an inclusive working environment where you're encouraged to #BeYourselfAtWork. We celebrate individuality and recognise that diversity and inclusion, in all forms, is critical to success. It's important to us that we recruit and develop as diverse a range of talent as we can, and we believe that everyone brings something different to the table, so we'd love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form, or you can mention it directly to your recruiter at any stage and they will be happy to help.

Why Join Capco?

Capco is a global technology and business consultancy focused on the financial services sector. We are passionate about helping our clients succeed in an ever-changing industry. You will work on engaging projects with some of the largest banks in the world, projects that will transform the financial services industry.

We are/have:

- Experts in banking and payments, capital markets, and wealth and asset management
- Deep knowledge of financial services offerings, including e.g. Finance, Risk and Compliance, Financial Crime and Core Banking
- Committed to growing our business and hiring the best talent to help us get there
- Focused on maintaining our nimble, agile and entrepreneurial culture

As a Data Engineer at Capco you will:

- Work alongside clients to interpret requirements and define industry-leading solutions
- Design and develop robust, well-tested data pipelines
- Demonstrate, and help clients adhere to, best practices in engineering and the SDLC
- Have excellent knowledge of building event-driven, loosely coupled distributed applications
- Have experience in developing both on-premise and cloud-based solutions
- Possess a good understanding of key security technologies and protocols, e.g. TLS, OAuth and encryption
- Support internal Capco capabilities by sharing insight, experience and credentials

Why Join Capco as a Data Engineer?

- You will work on engaging projects with some of the largest banks in the world, projects that will transform the financial services industry.
- You'll be part of a digital engineering team that develops new, and enhances existing, financial and data solutions, with the opportunity to work on exciting greenfield projects as well as established Tier 1 bank applications adopted by millions of users.
- You'll be involved in digital and data transformation processes through a continuous delivery model.
- You will work on automating and optimising data engineering processes, developing robust and fault-tolerant data solutions across both cloud and on-premise deployments.
- You'll be able to work across different data, cloud and messaging technology stacks.
- You'll have an opportunity to learn and work with specialised data and cloud technologies to widen your skill set.

Skills & Expertise

You will have experience working with some of the following methodologies and technologies.

Required Skills

- Hands-on working experience of the Databricks platform. Must have experience of delivering projects which use Delta Lake, orchestration, Unity Catalog and Spark Structured Streaming on Databricks.
- Extensive experience using Python, PySpark and the Python ecosystem, with good exposure to Python libraries.
- Experience with Big Data technologies and distributed systems such as Hadoop, HDFS, Hive, Spark, Databricks and Cloudera.
- Experience developing near real-time event streaming pipelines with tools such as Kafka, Spark Streaming and Azure Event Hubs (a minimal sketch of this pattern follows the skills lists).
- Excellent experience in the data engineering lifecycle, having created data pipelines which take data through all layers, from generation and ingestion through transformation to serving.
- Experience of modern software engineering principles and of creating well-tested, clean applications.
- Experience with Data Lakehouse architecture and data warehousing principles, including data modelling, schema design and working with semi-structured and structured data.
- Proficient in SQL, with a good understanding of the differences and trade-offs between SQL and NoSQL, and between ETL and ELT.
- Proven experience in DevOps and in building robust production data pipelines and CI/CD pipelines on e.g. Azure DevOps, Jenkins, CircleCI or GitHub Actions.

Desirable Skills

- Experience developing in other languages, e.g. Scala/Java.
- Enthusiasm and ability to pick up new technologies as needed to solve problems.
- Exposure to working with PII and sensitive data, and an understanding of data regulations such as GDPR.
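For readers skimming the stack above, here is a minimal sketch of the kind of pipeline the required skills describe: a PySpark Structured Streaming job that reads events from Kafka and lands them in a Delta table. This is an illustration only, not Capco's implementation; the broker address, topic name, schema, table name and checkpoint path are all invented for the example.

```python
# Minimal sketch of a near real-time pipeline on Databricks: Kafka -> Delta.
# Assumptions (not from the job ad): a "payments" topic, a Unity Catalog
# table "main.bronze.payments_raw", and a reachable Kafka broker.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.getOrCreate()  # pre-created on Databricks

# Schema of the JSON payload carried in the Kafka message value.
schema = StructType([
    StructField("payment_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "payments")                   # hypothetical topic
    .load()
)

# Kafka delivers the payload as bytes; cast to string and parse the JSON.
parsed = (
    raw.select(from_json(col("value").cast("string"), schema).alias("p"))
    .select("p.*")
)

# Stream into a Delta table; the checkpoint path is hypothetical.
query = (
    parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/Volumes/main/bronze/checkpoints/payments_raw")
    .toTable("main.bronze.payments_raw")
)
```

The checkpoint location is what makes the stream restartable and fault tolerant: Spark records which Kafka offsets have been committed there, so a restarted job resumes where it left off instead of dropping or reprocessing events.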


Industry Insights

Discover in-depth articles, industry insights, expert tips, and curated resources.

Machine Learning Jobs for Career Switchers in Their 30s, 40s & 50s (UK Reality Check)

Are you considering a career change into machine learning in your 30s, 40s or 50s? You’re not alone. In the UK, organisations across industries such as finance, healthcare, retail, government & technology are investing in machine learning to improve decisions, automate processes & unlock new insights. But with all the hype, it can be hard to tell which roles are real job opportunities and which are just buzzwords. This article gives you a practical, UK-focused reality check: which machine learning roles truly exist, what skills employers really hire for, how long retraining realistically takes, how to position your experience, and whether age works for or against you. Whether you come from analytics, engineering, operations, research, compliance or business strategy, there is a credible route into machine learning if you approach it strategically.

How to Write a Machine Learning Job Ad That Attracts the Right People

Machine learning now sits at the heart of many UK organisations, powering everything from recommendation engines and fraud detection to forecasting, automation and decision support. As adoption grows, so does demand for skilled machine learning professionals. Yet many employers struggle to attract the right candidates. Machine learning job adverts often generate high volumes of applications, but few applicants have the blend of modelling skill, engineering awareness and real-world experience the role actually requires. Meanwhile, strong machine learning engineers and scientists quietly avoid adverts that feel vague, inflated or confused. In most cases, the issue is not the talent market — it is the job advert itself. Machine learning professionals are analytical, technically rigorous and highly selective. A poorly written job ad signals unclear expectations and low ML maturity. A well-written one signals credibility, focus and a serious approach to applied machine learning. This guide explains how to write a machine learning job ad that attracts the right people, improves applicant quality and strengthens your employer brand.

Maths for Machine Learning Jobs: The Only Topics You Actually Need (& How to Learn Them)

Machine learning job adverts in the UK love vague phrases like “strong maths” or “solid fundamentals”. That can make the whole field feel gatekept, especially if you are a career changer or a student who has not touched maths since A level. Here is the practical truth: for most roles on MachineLearningJobs.co.uk, such as Machine Learning Engineer, Applied Scientist, Data Scientist, NLP Engineer, Computer Vision Engineer or MLOps Engineer with modelling responsibilities, the maths you actually use is concentrated in four areas:

- Linear algebra essentials (vectors, matrices, projections, PCA intuition)
- Probability & statistics (uncertainty, metrics, sampling, base rates)
- Calculus essentials (derivatives, chain rule, gradients, backprop intuition)
- Basic optimisation (loss functions, gradient descent, regularisation, tuning; a tiny worked example follows below)

If you can do those four things well, you can build models, debug training, evaluate properly, explain trade-offs & sound credible in interviews. This guide gives you a clear scope plus a six-week learning plan, portfolio projects & resources, so you can learn with momentum rather than drowning in theory.
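To make the fourth area concrete, here is a toy, self-contained Python sketch (not from the article) of gradient descent on a regularised squared-error loss. The data and hyperparameters are made up for the example.

```python
# Fit y = w * x to toy data by gradient descent on a mean squared error
# loss with L2 regularisation: L(w) = mean((w*x - y)^2) + lam * w^2.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 8.1]  # roughly y = 2x, with a little noise
lam = 0.01                 # regularisation strength
lr = 0.01                  # learning rate

w = 0.0
for step in range(200):
    # dL/dw = mean(2*x*(w*x - y)) + 2*lam*w, by the chain rule
    grad = sum(2 * x * (w * x - y) for x, y in zip(xs, ys)) / len(xs) + 2 * lam * w
    w -= lr * grad  # step downhill against the gradient

print(round(w, 3))  # converges to roughly 2.0
```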