GCP Data & AI Architect
Location: Belgium or The Netherlands
As a Data & AI Architect, you're a practical problem-solver who loves helping clients use data to make smarter decisions. You're great at getting to know their business and finding ways that data and AI can help them reach their goals.
You enjoy working on the whole data & AI process, from brainstorming ideas to making them a reality, always looking for new and exciting ways to make an impact.
The impact you'll make
By thoroughly understanding the customer's unique challenges and objectives, you'll create tailor-made AI solutions that optimize operations, uncover valuable insights, and fuel informed decision-making.
Together, we'll transform the customer's data into a strategic asset, enabling them to streamline processes, anticipate future trends, and achieve a competitive advantage in their industry.
Your commitment to collaboration and innovation ensures that you not only solve current problems but also proactively position your solutions for long-term success.
Your skills
- Proficiency in utilizing GCP's AI and ML services such as AI Platform, AutoML, BigQuery ML and Vertex AI.
- Knowledge of data storage solutions on GCP including BigQuery, Cloud Storage, and Cloud Bigtable.
- Expertise in AI/ML frameworks supported on GCP such as TensorFlow, PyTorch, and scikit-learn.
- Ability to design and implement scalable data pipelines using GCP services like Dataflow and Dataproc.
What's in it for you?
- Lead innovation by designing AI and data solutions using GCP
- Drive transformative initiatives that directly influence client outcomes
- Engage in continuous (peer) learning
- Contribute at a tactical level within our company
- Enjoy an open and informal company culture
- Collaborate with awesome colleagues
Meet your future colleague Casper
For my current project, I work on the Google Cloud Platform, connecting data from different sources such as Salesforce and MariaDB and bundling them in a modernized data warehouse. I build custom Spark ETL pipelines with Cloud Data Fusion. With the Data Loss Prevention API, I encrypt sensitive data with keys to ensure security and privacy. To orchestrate these pipelines, I use Cloud Composer (Apache Airflow); event-triggered functionality is handled by serverless Cloud Functions and object storage in Cloud Storage. I also build technical dashboards in Data Studio that monitor the components and their costs. In the end, the data lands in a Kimball data warehouse model in BigQuery, where it is consumed by MicroStrategy reports.
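To give a flavour of the orchestration idea behind Cloud Composer / Apache Airflow — tasks declared with dependencies and executed in an order that respects them — here is a minimal plain-Python sketch. This is an illustration, not Casper's actual code, and the task names are invented for the example:

```python
# Toy stand-in for an orchestrated pipeline: each task maps to the set of
# tasks it depends on, and we execute them in topological order.
from graphlib import TopologicalSorter

pipeline = {
    "extract_salesforce": set(),
    "extract_mariadb": set(),
    "encrypt_sensitive_fields": {"extract_salesforce", "extract_mariadb"},
    "spark_etl": {"encrypt_sensitive_fields"},
    "load_bigquery": {"spark_etl"},
    "refresh_reports": {"load_bigquery"},
}

def run(pipeline: dict[str, set[str]]) -> list[str]:
    """Return (and 'execute') the tasks in a dependency-respecting order."""
    order = list(TopologicalSorter(pipeline).static_order())
    for task in order:
        print(f"running {task}")
    return order

if __name__ == "__main__":
    run(pipeline)
```

In a real Cloud Composer deployment, the same dependency structure would be expressed as an Airflow DAG, with each task calling the corresponding GCP service.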
In my downtime I like to go to the movies and to cycle, training for the company's yearly bike event. We are going to Italy for Duchenne Heroes: three days of cycling with my colleagues while supporting a good cause.
Why Agiliz
We’re a team of about 50 people, all interested in data and analytics. Fun, informal and open communication define us. We’re like a small village where everyone knows everyone and we’re always willing to lend a hand to our neighbour.
We don’t limit ourselves to one technology or one business sector; the variety of projects and technology stacks, together with our knowledge-sharing culture, is one of the reasons applicants choose us.
Then again, our love for team events, food and good vibes might have something to do with it as well.
Meet your future colleague Nina
My role as a Data Engineer covers a wide range of responsibilities. It starts with gathering and analyzing business requirements, followed by diving into data sources to identify opportunities that meet those needs. I then move on to the process of extracting, storing, cleaning, transforming, and modeling the data. Additionally, I provide training and coaching sessions to ensure that end users can make the most of the data solutions.
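The extract, clean, transform and model steps Nina lists can be sketched in a few lines. This is a hypothetical, in-memory example — the dataset and field names are invented, not taken from a real project:

```python
# Invented sample data: one row has a missing amount, one is a duplicate.
raw_orders = [
    {"order_id": "1", "amount": "120.50", "country": "be"},
    {"order_id": "2", "amount": None, "country": "NL"},
    {"order_id": "1", "amount": "120.50", "country": "be"},
    {"order_id": "3", "amount": "75.00", "country": "nl"},
]

def clean(rows):
    """Drop duplicate order_ids and rows with a missing amount."""
    seen, out = set(), []
    for row in rows:
        if row["order_id"] in seen or row["amount"] is None:
            continue
        seen.add(row["order_id"])
        out.append(row)
    return out

def transform(rows):
    """Cast types and normalize country codes to upper case."""
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "country": r["country"].upper()}
        for r in rows
    ]

modeled = transform(clean(raw_orders))
print(modeled)
```

In practice these steps run against real sources and a warehouse rather than Python lists, but the shape of the work — validate, deduplicate, type, normalize — is the same.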
I always go the extra mile for my clients and take pride in the robust data solutions I design.
Speaking of miles, one of my favorite activities is riding my motorcycle. I'm especially excited about my upcoming two-week road trip through Tanzania!