Senior Data Architecture Specialist II
"Bu ilan aracılığıyla yapacağınız başvurular kapsamında toplanacak kişisel verileriniz veri sorumlusu sıfatıyla Sahibinden Bilgi Teknolojileri Pazarlama ve Ticaret Anonim Şirketi (“Sahibinden”) tarafından 6698 sayılı Kişisel Verilerin Korunması Kanunu (“KVKK”) ve ilgili mevzuat uyarınca iş başvuru süreçlerinin yürütülmesi ve iş başvurunuzun değerlendirilmesi amaçlarıyla işlenecektir.”
“We value equal opportunity in our hiring. In line with our Diversity, Equity and Inclusion Policy, we evaluate every application process transparently and fairly, focusing on each candidate's abilities without discrimination based on religion, language, race, or gender.”
Welcome to the sahibinden.com world!
- We work for WOW! moments
- We stay curious and explore, we love innovation
- We look into the same topics from different perspectives
- We celebrate and feel proud of our success together
Sahibinden.com has been helping people realize their dreams for the last 24 years.
With more than 55 million users and approximately 1,000 employees, sahibinden.com is among the top five companies in the classifieds sector globally.
Approved as an R&D center in Turkey in 2017, sahibinden.com has pioneered a variety of new services and provides continuous, reliable service over web and mobile networks. Through our dynamic and agile teams, we continue to be a driving force in the digital marketplace by developing scalable, high-performance platforms that serve millions of users daily.
At sahibinden.com, we adopt a culture of innovation and continuous improvement at work to empower our employees. This approach, which fosters creative thinking and professional development, consistently earns us the title of one of The Best Workplaces in Europe, as voted by our employees.
Join us and be a part of our journey to redefine the future of e-commerce.
Is that you? Great!
- Bachelor's degree in Computer Science, Information Systems, Industrial Engineering, or a related field.
- Minimum 2 years of experience as a Data Warehouse Architect, with a focus on Vertica as the database platform.
- Proficient in using Python libraries to build and automate Data Quality checks, validation rules, data cleansing, and profiling tasks (a minimal sketch follows this list).
- Strong experience with SQL Tuning, including optimizing queries by investigating execution plans, indexing strategies, and other performance improvements.
- Expertise in Data Lineage management, ensuring visibility of data flows and transformations throughout the pipeline.
- Hands-on experience in building ETL/ELT pipelines and working with modern ETL tools.
- Strong proficiency in writing complex SQL queries, stored procedures, and functions in Vertica.
- Solid understanding of Data Quality frameworks and tools, with experience in implementing automated data validation and monitoring using Python.
- Excellent problem-solving skills with the ability to address complex data issues and improve system efficiency.
- Strong communication skills, with the ability to collaborate effectively with cross-functional teams and business stakeholders.
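For illustration, here is a minimal sketch of the kind of Python-based Data Quality check described above, assuming pandas as the tooling library; the table and column names (listing_id, price) are hypothetical, not taken from the posting.

```python
# Minimal sketch of an automated Data Quality check in Python.
# Assumes pandas; listing_id and price are hypothetical column names.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Apply simple validation rules and return a rule -> pass/fail report."""
    return {
        # Completeness: the business key must not contain nulls.
        "listing_id_not_null": bool(df["listing_id"].notna().all()),
        # Uniqueness: the business key must not repeat.
        "listing_id_unique": bool(df["listing_id"].is_unique),
        # Validity: prices must be strictly positive.
        "price_positive": bool((df["price"] > 0).all()),
    }

if __name__ == "__main__":
    sample = pd.DataFrame({"listing_id": [1, 2, 3], "price": [100.0, 250.5, 75.0]})
    for rule, passed in run_quality_checks(sample).items():
        print(f"{rule}: {'PASS' if passed else 'FAIL'}")
```

In practice, checks like these would run inside the ETL pipeline and feed monitoring rather than print to the console.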
If you meet the requirements, your job description is below:
We are seeking an experienced Senior Data Warehouse (DWH) Architect with expertise in enhancing DWH pipeline structures, SQL Tuning, and Data Quality management to join our team for an enterprise-level DWH project. The ideal candidate will have strong experience with Vertica and SQL Tuning for automating and optimizing ETL pipelines and Data Quality architecture. This role involves ensuring high-quality data integration and processing, and tracking data lineage within a large-scale DWH solution.
Responsibilities:
- Design and optimize complex ETL processes for a large-scale Data Warehouse using Vertica as the primary database platform.
- Use Python to build and automate Data Quality pipelines, ensuring data accuracy, consistency, and integrity across the Data Warehouse.
- Perform SQL Tuning to optimize query performance, enhance data retrieval efficiency, and improve overall system performance (see the plan-inspection sketch after this list).
- Implement Data Quality validation rules, data cleansing processes, and data profiling with Python and integrate them into the ETL pipeline.
- Manage and track Data Lineage on DataHub to document the flow and transformation of data across the entire pipeline, ensuring transparency and traceability.
- Collaborate with Data Analytics Specialists and Business Process Owners to understand business requirements and ensure data quality meets those needs.
- Build and maintain automated data validation and monitoring scripts using vsql to ensure data quality at every stage of the ETL process.
- Collaborate with cross-functional teams to ensure seamless integration of data from various sources into the DWH, maintaining high integration performance across CDC and Spark data pipelines.
- Regularly monitor data processing workflows, identifying and implementing improvements for performance and data quality.
- Provide mentorship and guidance to junior team members, ensuring adherence to coding standards, best practices, and efficient development processes.
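As a rough illustration of the SQL Tuning responsibility above, the sketch below pulls a query's optimizer plan from Vertica using the vertica-python client; the connection details and the dwh.fact_listings table are placeholders, not details from the posting.

```python
# Hedged sketch: inspect a Vertica query plan from Python as a first
# tuning step. Requires the vertica-python package; all connection
# details and the table name are placeholders.
import vertica_python

conn_info = {
    "host": "vertica.example.internal",  # placeholder
    "port": 5433,
    "user": "dwh_user",                  # placeholder
    "password": "***",
    "database": "dwh",
}

QUERY = "SELECT category, COUNT(*) FROM dwh.fact_listings GROUP BY category"

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # EXPLAIN returns the optimizer's plan as rows of text; scanning it
    # for full scans or broadcast joins guides projection and query fixes.
    cur.execute(f"EXPLAIN {QUERY}")
    for row in cur.fetchall():
        print(row[0])
```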
Preferred Skills
- Expertise in managing and optimizing Vertica (e.g., partitioning, distribution keys, compression); a brief DDL sketch follows this list.
- Experience with Big Data technologies (e.g., Trino, Spark) is a plus.
- Knowledge of cloud platforms (GCP) and their data services (e.g., BigQuery).
- Familiarity with version control systems like Git.
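To make the Vertica physical-design options above concrete, here is a hedged DDL sketch with hypothetical table and column names; note that Vertica expresses distribution through its SEGMENTED BY clause.

```python
# Hedged sketch of Vertica physical design: column encoding, segmentation
# (Vertica's form of a distribution key), and monthly partitioning.
# Table and column names are hypothetical; the DDL would typically be run
# via vsql or a client connection like the one sketched earlier.
CREATE_FACT_TABLE = """
CREATE TABLE dwh.fact_listings (
    listing_id  INT NOT NULL,
    category    VARCHAR(64) ENCODING RLE,  -- low-cardinality: RLE compresses well
    price       NUMERIC(12, 2),
    created_at  TIMESTAMP NOT NULL
)
SEGMENTED BY HASH(listing_id) ALL NODES   -- spread rows evenly across nodes
PARTITION BY EXTRACT(YEAR FROM created_at) * 100
           + EXTRACT(MONTH FROM created_at);  -- drop old months cheaply
"""
print(CREATE_FACT_TABLE)
```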
What We Offer
- 2 weeks available throughout the year
- Birthday leave
- One week accommodation at Richmond Efes and Pamukkale Hotels (for employees who have completed 3 years at Sahibinden.com)
- Well-being opportunities:
  - Quality life package
  - Gym and dietitian service