Senior Test Automation Engineer Worldwide
Scrapinghub is a fast-growing, diverse technology business that turns web content into useful data through open source projects such as the Scrapy web crawling framework.
We’re a globally distributed team of over 100 Shubbers who are passionate about scraping, web crawling and data science.
As a new Shubber, you will:
Become part of a self-motivated, progressive, multi-cultural team.
Have the opportunity to work remotely.
Have the opportunity to go to conferences and meet with the team from across the globe.
Get the chance to work with cutting-edge open source technologies and tools.
About the Job:
QA is an important function within Scrapinghub. The QA team works to ensure that the quality and usability of the data scraped by our web scrapers meets and exceeds the expectations of our enterprise clients.
Are you passionate about data and data quality and integrity?
Do you enjoy using Python to automate testing, analyze data, and speed up manual processes?
Are you highly customer-focused with excellent attention to detail?
Due to growing business and the need for ever more sophisticated QA, we are looking for a talented Senior Test Automation Engineer with substantial experience in Python to join our team.
As a Scrapinghub Engineer, you will build automated test frameworks and ad hoc test scripts to assist in the verification and validation of data quality.
Due to business requirements, candidates must be based in a European or U.S. time zone.
Job Responsibilities:
- Understand customer web scraping and data requirements, and map them to automated tests.
- Analyze gaps in test coverage and bridge gaps with appropriate automated solutions (full-blown automated test frameworks and ad-hoc scripts) in Python.
- Work under minimal supervision and collaborate effectively with the Head of QA, Project Managers, and Developers to meet your test automation deliverables.
- Draw conclusions about data quality by producing basic descriptive statistics, summaries, and visualizations (using Python or other technologies).
- Leverage Scrapinghub's proprietary Continuous Integration systems (or build your own) to ensure that automated tests run for every spider execution and data delivery.
- Beyond Python-based test automation, proactively suggest and take ownership of improvements to QA processes and methodologies, employing other technologies and tools where appropriate.
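To give a flavor of the kind of ad hoc data-quality script this role involves, here is a minimal, hypothetical sketch in Python. The field names and validation rules are invented for illustration and do not reflect any actual Scrapinghub schema or framework:

```python
# Hypothetical ad hoc data-quality check for scraped items.
# Field names and rules are illustrative, not a real schema.
import re

REQUIRED_FIELDS = {"name", "price", "url"}
URL_RE = re.compile(r"^https?://")

def validate_item(item):
    """Return a list of human-readable problems found in one scraped item."""
    problems = []
    missing = REQUIRED_FIELDS - item.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if "price" in item and not isinstance(item["price"], (int, float)):
        problems.append(f"price is not numeric: {item['price']!r}")
    if "url" in item and not URL_RE.match(str(item["url"])):
        problems.append(f"url is not absolute: {item['url']!r}")
    return problems

def coverage(items, field):
    """Fraction of items with a non-empty value for `field`,
    a basic descriptive statistic for judging data quality."""
    if not items:
        return 0.0
    return sum(1 for it in items if it.get(field)) / len(items)

if __name__ == "__main__":
    items = [
        {"name": "Widget", "price": 9.99, "url": "https://example.com/w"},
        {"name": "Gadget", "price": "N/A", "url": "/relative/path"},
    ]
    for i, item in enumerate(items):
        for problem in validate_item(item):
            print(f"item {i}: {problem}")
    print(f"price coverage: {coverage(items, 'price'):.0%}")
```

In practice, checks like these would be wired into a test framework and CI so they run on every spider execution and data delivery.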
Requirements:
- BS degree in Computer Science, Engineering, or equivalent.
- Demonstrable Python programming knowledge and experience, minimum of 3 years (please provide code samples in your application, via a link to GitHub or other publicly-accessible service).
- Experience in developing automated test frameworks in Python.
- Minimum of 5 years in a Software Test, Software QA, or Software Development role, working in Agile, fast-paced environments and projects.
- You have served as the lead Test Automation Engineer (or as a Team Lead with hands-on automation responsibilities) in at least one previous role or high-importance project.
- Solid grasp of web technologies and protocols (HTML, XPath, JSON, HTTP, CSS etc.); experience in developing tests against HTTP/REST APIs.
- Strong knowledge of software QA methodologies, tools, and processes.
- Ability to formulate complex SQL queries, or experience emulating them in Python with libraries like Pandas or PySpark.
- Excellent written and spoken English; a confident communicator, able to discuss all matters of QA with both technical and non-technical stakeholders, and adept at training non-technical colleagues on test automation execution.
Bonus points for:
- Knowledge and experience of Scrapy and other Python-based scraping frameworks.
- Prior experience in a Data QA role (where the focus was on verifying data quality, rather than testing application functionality).
- Interest in and flair for Data Science concepts as they pertain to data analysis and data validation (machine learning, inferential statistics etc.); if you have ideas, mention them in your application.
- Knowledge of and experience with other technologies that support a modern cloud-based software service (Linux, AWS, Docker, Spark, Kafka, etc.).
- Previous remote working experience.
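As a hypothetical illustration of the SQL-in-Python point above, the following sketch emulates a simple GROUP BY / HAVING query with pandas (the column names, table, and data are invented for this example):

```python
# Hypothetical example: emulating a SQL aggregation query in pandas.
#   SELECT spider, COUNT(*) AS items, AVG(price) AS avg_price
#   FROM scraped_items GROUP BY spider HAVING COUNT(*) > 1;
import pandas as pd

df = pd.DataFrame({
    "spider": ["books", "books", "toys"],
    "price": [10.0, 20.0, 5.0],
})

# GROUP BY spider with named aggregations (COUNT and AVG equivalents).
summary = (
    df.groupby("spider")
      .agg(items=("price", "size"), avg_price=("price", "mean"))
      .reset_index()
)
# HAVING COUNT(*) > 1 becomes a boolean filter on the aggregated frame.
summary = summary[summary["items"] > 1]
print(summary)
```

The same query could be expressed in PySpark with an almost identical `groupBy`/`agg`/`filter` chain.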