Interns Wanted

ABOUT US

Learn more about what we do

Partnerships that create and grow

Our client relationships go beyond technology. Enthusiasts to the core, we bring real solutions to each client’s problems through a deep understanding of their market, solution, and vision.

Product Discovery

An idea solves a problem. We help you shape that idea, build a product, and scale it into a successful business. First we ask why. Then we develop a roadmap to ensure everything is in place before development of the MVP begins.

Think globally, act locally

We create products for a digital world and help clients through innovation, end-user engagement and technical quality.

CAREERS

See all we have to offer you

We develop technology solutions

Our Vision is to be a leading software solutions company for a variety of industries. We know that our customers’ growth is our growth, so we commit ourselves to helping customers achieve their business goals. We want to be known as a reliable, innovative, top-quality software service provider in the IT industry.

Our Mission is to enhance our customers’ business growth through creative design and development, delivering market-defining, high-quality solutions that create value and a reliable competitive advantage for customers around the globe.

Data Engineer

We are looking for an experienced Data Engineer who can take a proactive role in a self-organized and driven team where the “Innovation by All” revolution is well underway.

If you are eager to work in an agile organization, where re-prioritization happens regularly, and you are willing to drive the agenda and continually find improvements both in code and in processes, HTEC is the right place for you.

Important note: All interviews are held online, in line with current public-health recommendations regarding the coronavirus.

Key Responsibilities:

  • Participate in the design and development of Big Data applications
  • Create ETL processes and frameworks for analytics, data management, and data warehousing
  • Implement large-scale near real-time streaming data processing pipelines
  • Scale and optimize data systems for batch and streaming operations
  • Implement and manage the data life cycle stages

Requirements:

  • Strong coding experience in Scala, Java, or Python
  • Strong knowledge of relational and non-relational databases
  • Experience with REST/SOAP APIs
  • Experience with most common AWS services (EC2, S3, …)
  • In-depth knowledge of the Hadoop ecosystem (HDFS, Spark, HIVE)
  • Knowledge of Unix-like operating systems (shell, ssh, grep, awk)
  • Experience with GitHub-based development processes and Scrum methodology
  • Fluent English is a must

An ideal candidate would also have:

  • Experience with streaming technologies (Kafka, Spark Streaming)
  • Experience with job scheduling tools (Jenkins, Apache Airflow)
  • Experience with building data pipelines on AWS, Azure, or Google Cloud
  • Experience with data modeling, data architecture, and data versioning
  • Experience with BI platforms and solutions (Tableau, Qlik, …)
  • Experience with Machine Learning would be a huge plus

Let's Get Started Today

Whether you're just getting started with your app idea or you're a multinational trying to evolve your brand for the future, we're ready to work with you. Reach out to us today and let's start talking about your project ideas.

NIS, SERBIA

Bulevar Nemanjica 30
office@dilig.net

CONTACT FORM

Do you have any questions? Send us a message

SOCIAL NETWORK

Find us online and like, follow, and watch us for the latest updates.