(COMPANY NAME) is a rapidly growing VC-backed company based in London, building software to enable the safe and ethical use of valuable data for analytics and machine learning. We work with large organisations worldwide in financial services, telecommunications, pharma and government, enabling them to get the most out of data without compromising on privacy and security.
(COMPANY NAME) is pioneering the new enterprise software category of Privacy Engineering to serve this emerging business need and address a social issue of growing importance. Our technology enables organisations to safely analyse and mine sensitive datasets while protecting the individual's privacy.
You must enjoy and value working in a close-knit team with other very talented people, and have strong communication skills. You'll be positive, constructive and proactive - you'll identify problems and instead of complaining about them, you'll work with others to fix them. You constantly seek both to learn from and to help others, and to find better ways of working. You favour helping the team be successful over looking smart yourself.
Due to strong customer demand we are expanding our consulting team and have an opening for a solutions architect to lead the design of the installation architecture, integration and orchestration of (COMPANY NAME) products and customer data in strategically important customer projects.
Ideal candidates for this role will learn and adapt quickly, and will be comfortable being dropped into challenging technical problems with the responsibility to solve them.
* Own the delivery architecture covering (COMPANY NAME) products, integration with customer components, operational data flow and orchestration.
* Maintain expertise and a detailed understanding of the operation and usage of (COMPANY NAME) products, as well as familiarity with their configuration and maintenance
* Understand and satisfy customer data privacy issues and requirements (size, scope, risk issues, exposure impact), data volumes, anticipated masking complexity, performance goals and the existing operational data infrastructure (e.g. Hadoop)
* Understand and satisfy customer requirements for data separation, authorisation, information security controls and regulatory issues
* Contribute to product, privacy and technical discussions during pre-sales customer engagement
* Actively identify and act upon new opportunities within customer accounts
* Define the test strategy and acceptance criteria along with approaches for resilience, business continuity and backup/recovery where appropriate
* Obtain formal agreement and sign-off for solution architecture
* Travel to customer sites within Europe, and possibly further afield, can be expected
* Establish strong working relationships with (COMPANY NAME) teams in our London headquarters
* Collaborate with UK teams to leverage and further develop the existing knowledge base of product and implementation architecture, best practices and defined implementation approaches
* Bachelor's degree in Computer Science or a Science or Engineering discipline
* Proven track record of architecting sophisticated product integrations in customer enterprise environments, leveraging components such as Hadoop, Data Flow, enterprise security modules, RDBMS, workflow automation, ETL tools and other related technologies
* Experience of administering software applications and tools in Linux
* Experience with scripting languages (e.g. shell, python, perl)
* Experience with database schemas and SQL
* Proven ability to deliver results under pressure with rapidly evolving propositions, client demands and business needs
* You care deeply about customer success
* You enjoy the variety and fast pace of a dynamic start-up, you're flexible in your approach and comfortable with ambiguity.
* You have a good sense of humour, and think work should be fun as well as intellectually satisfying
* Detailed knowledge of Hadoop architecture including primary operational components (HDFS, YARN, Kerberos, Ambari/Hue, Hive/Impala)
* Detailed knowledge of common Data Flow / Streaming environments and technology (e.g. Apache NiFi, Kafka, Confluent, StreamSets)
* Operational experience of customer Hadoop deployments (Hortonworks, Cloudera)
* Programming experience in Java, Python or similar
* Experience of integrating with enterprise RDBMS infrastructure
* Experience with Amazon AWS and other cloud platforms
* Broad knowledge of Hadoop and Linux security infrastructure
* Experience of integrating to LDAP-based directory services for authentication and authorisation
* Experience of implementing solutions for resilience, business continuity and backup/recovery
* Exposure to data warehouses and operational data stores
* Gathering, reviewing and validating business and data requirements
* Experience with BI tools such as Tableau or QlikView
* Publicly available posts/articles, public speaking or examples of thought leadership
(COMPANY NAME) does not accept unsolicited referrals or CVs from any source other than directly from candidates or approved agencies with written agreements in place and instructed on specified roles.
Unsolicited CVs received from any agency not engaged as outlined above will be considered a "free gift", and no fees will be due should we choose to contact the candidate directly. Receipt of unsolicited CVs will in no way establish any prior claim to the candidate should they also be submitted by another agency. We consider this type of activity an attempt to lay claim to a given candidate and therefore entirely inappropriate. Any submission of unsolicited CVs to us will be deemed full acceptance of these terms.
We only engage with agencies who are respectful of candidates, businesses and other agencies. We abide by our agreements with them and maintain genuine, straightforward and lasting relationships which generate the highest calibre candidates for our business.