Data pipeline

Moving data from source to target is an art. The plethora of tools and methodologies demands an understanding that is both deep and broad. We lead customers on their journey to conquer these challenges.

Technologies

We understand data. A data pipeline is a series of data processing steps: data is ingested at the start of the pipeline, and each subsequent step delivers an output that serves as the input to the next step.
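The chaining described above can be sketched in a few lines of Python. This is a minimal illustration, not any specific product's API; the step names (ingest, clean, transform) are hypothetical placeholders.

```python
# Minimal sketch of a data pipeline: each step's output is the next step's input.
# Step names and data are illustrative only.

def ingest():
    # First step: ingest raw records at the start of the pipeline.
    return [" alice,3 ", " bob,5 "]

def clean(records):
    # Intermediate step: normalize whitespace.
    return [r.strip() for r in records]

def transform(records):
    # Final step: parse each record into a (name, value) pair.
    return [(name, int(value)) for name, value in (r.split(",") for r in records)]

def run_pipeline(steps):
    # Chain the steps: feed each step's output into the next.
    data = steps[0]()
    for step in steps[1:]:
        data = step(data)
    return data

result = run_pipeline([ingest, clean, transform])
# result is now a list of cleaned, parsed (name, value) pairs.
```

Real pipelines add scheduling, retries, and monitoring around this core idea, which is where workflow-management tooling comes in.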

Architecture and design

  • Business understanding

  • Solution architecture

  • Choosing the right tools

  • Detailed design

  • Cost management

Workflow management

  • Authoring

  • Scheduling

  • Monitoring

Serverless platforms

  • AWS Lambda

  • Azure Functions

  • GCP Cloud Functions

Data stores

  • Relational databases

  • NoSQL

  • Text search

  • Caching and in-memory

  • File systems and file formats

Data science

  • Data acquisition

  • Data understanding

  • Data preparation

  • Modeling

  • Deployment

Methodologies

  • Parallel computing data grids

  • Big data warehouses

  • Queues and message brokers

  • Streaming

  • Visualization, analytics and BI

ETL and integration platforms

  • Enterprise platforms

  • Open source platforms

  • Cloud platforms

Visualization, analytics and BI

  • Tableau

  • Qlik

  • Power BI

  • Kibana

Cloud vendors

  • AWS - Amazon

  • Azure - Microsoft

  • GCP - Google

  • OCI - Oracle

Contact us

5 Shraga Fridman street, Tel Aviv, Israel

Follow us

  • Facebook
  • LinkedIn

Copyright © 2020
