
Website: BlackRock
Job Description:
The AI Labs team works collaboratively and is a multi-disciplinary team with the following skills and capabilities: optimization, machine learning, statistical modeling, exploratory data analysis, natural language processing, data visualization, network/graph modeling, ETL, data pipelines, data architecture, communication, project/product management, and strategy. We work with data from a wide variety of sources, including text, news feeds, financial reports, time-series transactions, user behavior logs, imagery, and real-time data.
Job Responsibilities:
- Work with data scientists to develop data-ready tools that support their work.
- You will help lead architecture on a multi-discipline, multi-region team of data scientists, engineers, and investment professionals on a corporate-wide set of client, investor, and operational problems.
- Act as lead to identify, design, and implement internal process improvements, and relay them to the relevant technology organization.
- You will be accountable for managing high-quality datasets exposed for internal and external consumption by downstream users and applications. Lead the creation and maintenance of optimized data pipeline architectures for large and sophisticated data sets.
- Assemble large, complex data sets that meet BlackRock business requirements.
- Assist in the development of business recommendations, presenting findings effectively to partners at multiple levels using visual analytic displays of quantitative information. Communicate findings with partners as needed.
- Work with partners to assist with data-related technical issues and support their data infrastructure needs.
- You will build and operationalize data pipelines that enable squads to deliver high-quality, data-driven products.
- Keep data separated and segregated according to relevant data policies.
- Automate manual ingest processes and optimize data delivery subject to service-level agreements; work with infrastructure teams on re-design for greater scalability.
- Improve BlackRock’s product and services suite by crafting, growing, and optimizing our data and data pipeline architecture.
Job Requirements:
- Experience building and optimizing ‘big data’ pipelines, architectures, and data sets. Familiarity with data pipeline and workflow management tools such as Luigi and Airflow.
- Experience with stream-processing systems such as Storm and Spark Streaming.
- Advanced working SQL knowledge and experience with relational databases.
- Experience with Amazon Web Services (AWS) and Google Cloud Platform (GCP).
- Experience with Hadoop, Spark, and Kafka.
- Experience with an object-oriented or object-function scripting language such as Python, Scala, or Java.
- 3–5+ years of experience in a data engineer role, with a BA or MS degree in a quantitative subject area (computer science, mathematics, statistics, data science, economics, physics, engineering, or a related field).
Job Details:
Company: BlackRock
Vacancy Type: Full Time
Job Location: San Jose, CA, US
Application Deadline: N/A