Job Overview
- Date Posted: September 29, 2023
- Expiration date: --
Job Description
The Data Engineer – Costco Logistics BI is accountable for the end-to-end data pipelines that support Costco Logistics reporting. This role is focused on data engineering: building and delivering automated data pipelines from a wide range of internal and external data sources. The Data Engineer will partner with product owners, engineering teams, and data platform teams to design, build, test, and automate data pipelines that are relied upon across the company as the single source of truth.
ROLE
- Develops and operationalizes data pipelines to make data available for consumption (Costco Logistics BI).
- Works with data architects and data/BI engineers to design data pipelines and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
- Designs, develops, and implements ETL/ELT processes using IICS (Informatica Cloud).
- Uses MySQL to improve and accelerate delivery of our data products and services.
- Uses Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, and Azure Data Factory to improve and accelerate delivery of our data products and services.
- Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights for the organization.
- Identifies, designs, and implements internal process improvements: automating manual processes and optimizing data delivery.
- Identifies ways to improve data reliability, efficiency, and quality of data management.
- Communicates technical concepts to non-technical audiences, both written and verbal.
- Performs peer reviews of other data engineers' work.
REQUIRED
- 3+ years' experience engineering and operationalizing data pipelines with large and complex datasets.
- 2+ years' experience with Informatica PowerCenter.
- 2+ years' experience with Informatica IICS.
- 2+ years' experience working with cloud technologies, including ADLS, Azure Databricks, Spark, Azure Synapse, Cosmos DB, and other big data technologies.
- 2+ years' experience with data modeling, ETL, and data warehousing.
- 2+ years' experience implementing data integration techniques such as event/message-based integration (Kafka, Azure Event Hub) and ETL.
- 2+ years' experience with Git / Azure DevOps.
- Extensive experience working with various data sources: SQL, SQL Server databases, flat files (CSV, delimited), Web APIs, XML.
- Advanced SQL skills; understanding of relational databases and business data, and the ability to write complex SQL queries against a variety of data sources.
- Strong understanding of database storage concepts: data lake, relational databases, NoSQL, graph, and data warehousing.
- Able to work in a fast-moving agile development environment.
Recommended
- Microsoft Azure or comparable certifications.
- Experience delivering data solutions using agile software development methodologies.
- Experience with PowerShell, Python, or a comparable scripting language.
- Experience with the UC4 job scheduler.
- Exposure to the retail industry.
- Excellent verbal and written communication skills.
- BA/BS in Computer Science, Engineering, or equivalent software/services experience.
Pay Ranges:
- Level 1 – $75,000 – $110,000
- Level 2 – $100,000 – $135,000