
DevOps Engineer with Data

  • By Weronika Drwila
  • 24 January 2023

We are looking for a GCP DevOps Engineer with Data / Data Engineer with experience in Big Data processing technologies and techniques to join our Krakow-based POD and lead the building of our target-state Data Warehouse solution and data models/products. Whilst core skills are listed below, our client is mainly looking for passionate people who continually improve and challenge themselves, and who work in a highly disciplined, verifiable manner.


The successful candidate will also develop, test, deploy, and optimize data pipelines (databases and files, in batch and stream), data models, data products, and reporting views for a variety of data sources (other cloud providers and on-premises), along with their supporting CI/CD pipelines.


Job Duties

  • Review and refine, interpret and implement business and technical requirements
  • Take part in ongoing productivity and prioritisation using user stories, Jira, backlogs, etc.
  • Deliver requirements to scope, quality, and time commitments in Agile mode and practice
  • Onboard new data sources; design, build, test, and deploy cloud data ingest, pipelines, warehouse, and data models/products
  • Build and operate optimal data pipelines/models/products with SQL, stored procedures, indexes, clusters, partitions, triggers, etc.
  • Create, own, enhance, and operate CI/CD pipelines using Git, Jenkins, Groovy, etc.
  • Deliver a data warehouse and pipelines that follow API, abstraction, and ‘database refactoring’ best practices in order to support evolutionary development and continual change
  • Develop procedures and scripts for data migration, back-population and feed-to-warehouse initialization
  • Extend the solution with Data Catalogue
  • Protect the solution with Data Governance, Security, Sovereignty, Masking and Lineage capabilities
  • Deliver non-functional requirements, IT standards, and developer and support tools to ensure our applications are secure, compliant, scalable, reliable, and cost-effective
  • Ensure a consistent approach to logging, monitoring, error handling and automated recovery as per client’s standards
  • Fix defects and deliver enhancements
  • Maintain a high-quality, up-to-date knowledge base, wiki, and admin pages for the solution
  • Peer-review colleagues’ changes
  • Speak up and help shape how we do things better


Essential Experience:

  • Expert in Administration and development of Traditional and Cloud Databases
  • Excellent understanding of GCP Core and Data Products, Architecting and solution design
  • At least one year of working experience in Google Cloud Platform development, especially on Data/ETL-related projects
  • Data preparation, wrangling, and refactoring skills, for example as part of a data science pipeline
  • IT methodology/practices knowledge and solid experience in Agile/Scrum
  • Experience in building and operating CI/CD life-cycle management with Git, Jenkins, Groovy, Checkmarx, Nexus, Sonar IQ, etc.
  • Experience with collaboration tools such as Jira, Confluence, and various board types
  • BS/MS degree in Computer/Data Science, Engineering or a related subject
  • Excellent communication and interpersonal skills in English; proficiency in spoken, listening, and written English is crucial.
  • Enthusiastic willingness to rapidly and independently learn and develop technical and soft skills as needs require.
  • Strong organisational and multi-tasking skills.
  • Good team player who embraces teamwork and mutual support.
  • Interested in working in a fast-paced environment


Ideal Experience (the following skills/experience would be an added advantage):

  • Experience of deploying and operating Datafusion/CDAP based solutions
  • Experience with the DevOps model for GCP-based big data / ETL solutions
  • Expertise in Java, Python, and DataFlow
  • Broad experience with IT development and collaboration tools.
  • An understanding of IT Security and Application Development best practice.
  • Understanding of, and interest in, various investment products and their life cycle, and the nature of the investment banking business.
  • Experience of working with infrastructure teams to deliver the best architecture for applications.
  • Experience working in a global team across different cultures.
  • Previous experience working with the client company


Cooperation Insights:

  • Strategic programme forming part of the 2025 vision
  • Full-time employment/contracting model
  • Flexible working hours and home office, with a minimum of 2 days/week presence in the office
  • No dress code and no mobile device restrictions


Parent Friendly Policy:

  • Formal assistance when going on maternity/paternity leave
  • Nursery funding
  • Nursery room
  • Family days
  • Working parents’ community


Development opportunities:

  • Conference and training budget
  • Language course/studies partial reimbursement
  • Safari books
  • Online training: LinkedIn, Coursera
  • Internal training
  • Transfer between projects


To top it off:

  • Team events and networking events
  • Tech communities and cultural communities
  • Mentoring programs
  • On-site medical consultations in the office


Note: Prepare your CV in English (PDF), fill in the form and apply! 🙂

Please include in your CV the following clause necessary for the recruitment process:

“I agree to the processing of personal data that I have made available voluntarily in the recruitment process by the Administrator of personal data, i.e. Dotcommunity Spółka z ograniczoną odpowiedzialnością [Ltd.] based in Cracow, 15 Żabiniec Street, 31-215 Cracow, registered in Poland, the Cracow’s District Court – Śródmieście, XI Commercial Division of the National Court Register under number 0000468484, VAT number: 9452174499, (“Dotcommunity”) in order to carry out the recruitment process for the DevOps Engineer position on the basis of Art.6 item 1a of the Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)”.


Dotcommunity is registered in the Register of Employment Agencies (KRAZ) under number 9904.
