Azure Data Engineer (Databricks)

London
4 weeks ago
Applications closed


Joining Capco means joining an organisation that is committed to an inclusive working environment where you’re encouraged to #BeYourselfAtWork. We celebrate individuality and recognize that diversity and inclusion, in all forms, are critical to success. It’s important to us that we recruit and develop as diverse a range of talent as we can, and we believe that everyone brings something different to the table, so we’d love to know what makes you different. Such differences may mean we need to make changes to our process to allow you the best possible platform to succeed, and we are happy to cater to any reasonable adjustments you may require. You will find the section to let us know of these at the bottom of your application form, or you can mention it directly to your recruiter at any stage and they will be happy to help.

Why Join Capco?

Capco is a global technology and business consultancy focused on the financial services sector. We are passionate about helping our clients succeed in an ever-changing industry. You will work on engaging projects with some of the largest banks in the world, on projects that will transform the financial services industry.

We are/have:

- Experts in banking and payments, capital markets, and wealth and asset management
- Deep knowledge of financial services offerings, including e.g. Finance, Risk and Compliance, Financial Crime, Core Banking etc.
- Committed to growing our business and hiring the best talent to help us get there
- Focused on maintaining our nimble, agile and entrepreneurial culture

As a Data Engineer at Capco you will:

- Work alongside clients to interpret requirements and define industry-leading solutions
- Design and develop robust, well-tested data pipelines
- Demonstrate and help clients adhere to best practices in engineering and the SDLC
- Have excellent knowledge of building event-driven, loosely coupled distributed applications
- Have experience in developing both on-premise and cloud-based solutions
- Possess a good understanding of key security technologies and protocols, e.g. TLS, OAuth, encryption
- Support internal Capco capabilities by sharing insight, experience and credentials

Why Join Capco as a Data Engineer?

- You will work on engaging projects with some of the largest banks in the world, on projects that will transform the financial services industry.
- You’ll be part of a digital engineering team that develops new and enhances existing financial and data solutions, with the opportunity to work on exciting greenfield projects as well as established Tier 1 bank applications used by millions of users.
- You’ll be involved in digital and data transformation processes through a continuous delivery model.
- You will work on automating and optimising data engineering processes, developing robust and fault-tolerant data solutions for both cloud and on-premise deployments.
- You’ll be able to work across different data, cloud and messaging technology stacks.
- You’ll have the opportunity to learn and work with specialised data and cloud technologies to widen your skill set.

Skills & Expertise

You will have experience working with some of the following methodologies/technologies:

Required Skills

- Hands-on working experience of the Databricks platform, including delivery of projects that use Delta Lake, orchestration, Unity Catalog and Spark Structured Streaming on Databricks.
- Extensive experience using Python, PySpark and the Python ecosystem, with good exposure to Python libraries.
- Experience with Big Data technologies and distributed systems such as Hadoop, HDFS, Hive, Spark, Databricks and Cloudera.
- Experience developing near real-time event streaming pipelines with tools such as Kafka, Spark Streaming and Azure Event Hubs.
- Excellent experience of the data engineering lifecycle, having created data pipelines that take data through all layers, from generation and ingestion through transformation to serving.
- Experience of modern software engineering principles and of creating well-tested, clean applications.
- Experience with Data Lakehouse architecture and data warehousing principles, including data modelling, schema design and working with semi-structured and structured data.
- Proficiency in SQL and a good understanding of the differences and trade-offs between SQL and NoSQL, and between ETL and ELT.
- Proven experience in DevOps and building robust production data pipelines, with CI/CD pipelines on e.g. Azure DevOps, Jenkins, CircleCI or GitHub Actions.

Desirable Skills

- Experience developing in other languages, e.g. Scala/Java.
- Enthusiasm and ability to pick up new technologies as needed to solve problems.
- Exposure to working with PII and sensitive data, and an understanding of data regulations such as GDPR.
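The lifecycle described above (generation, ingestion, transformation, serving) is commonly structured on lakehouse platforms as layered "bronze/silver/gold" stages. As a rough, dependency-free illustration of that layering only, here is a toy sketch in plain Python; the layer functions, record fields and sample events are invented for the example and are not Databricks or PySpark APIs:

```python
import json
from datetime import datetime, timezone

# Toy layered pipeline: raw events land in "bronze", are cleaned
# into "silver", and aggregated into a serving-ready "gold" view.

def ingest_bronze(raw_lines):
    """Bronze: keep events as-is, tagging each with an ingestion timestamp."""
    ts = datetime.now(timezone.utc).isoformat()
    return [{"raw": line, "ingested_at": ts} for line in raw_lines]

def refine_silver(bronze):
    """Silver: parse and validate, dropping malformed records."""
    silver = []
    for rec in bronze:
        try:
            event = json.loads(rec["raw"])
        except json.JSONDecodeError:
            continue  # a real pipeline would quarantine bad rows instead
        if "user" in event and "amount" in event:
            silver.append({"user": event["user"], "amount": float(event["amount"])})
    return silver

def serve_gold(silver):
    """Gold: aggregate into a serving-ready view (total amount per user)."""
    totals = {}
    for rec in silver:
        totals[rec["user"]] = totals.get(rec["user"], 0.0) + rec["amount"]
    return totals

raw = ['{"user": "a", "amount": 10}', 'not json',
       '{"user": "a", "amount": 5}', '{"user": "b", "amount": 7}']
gold = serve_gold(refine_silver(ingest_bronze(raw)))
print(gold)  # {'a': 15.0, 'b': 7.0}
```

On Databricks itself, each stage would typically be a Delta Lake table populated by Spark (batch or Structured Streaming) rather than in-memory lists, but the flow of responsibilities between layers is the same.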

