What are the responsibilities and job description for the Sr Solutions Architect - Databricks position at Hakkōda, an IBM Company?
Introduction
At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.
The Opportunity
Your role and responsibilities
We are seeking a highly motivated and technically proficient Solution Architect with deep expertise in data, a strong understanding of Generative AI, and a proven track record with Databricks to join our dynamic team. In this pivotal role, you will bridge the gap between business challenges and technical solutions, working closely with prospective clients to understand their needs, demonstrate the transformative value of our consulting services, and craft compelling technical proposals. This is an exciting opportunity for an individual who is passionate about data and AI, enjoys problem-solving, and thrives in a client-facing, consultative sales environment.
Key Responsibilities
- Technical Pre-Sales and Discovery:
○ Collaborate with Account Executives to qualify sales opportunities and conduct in-depth discovery sessions with prospective clients to understand their current data landscape, business objectives, pain points, and technical requirements.
○ Act as a trusted advisor, offering insights and guidance on data strategy, governance, architecture, traditional analytics, machine learning, and the strategic application of Generative AI.
○ Quarterback technical pursuits, identifying when to leverage the extended team of industry and technical experts to bring the right resources to each pursuit.
- Solution Design and Presentation:
■ Architecting and optimizing data pipelines and data warehouses on Databricks and cloud platforms
■ Data ingestion strategies
■ Business Intelligence and Data Visualization integrations
■ AI/ML integration and MLOps, with a strong focus on ML model training, deployment, and governance within Databricks
■ Generative AI use cases (e.g., custom LLM development, RAG implementations, prompt engineering strategies, AI agent frameworks, fine-tuning open-source models), demonstrating how these can be built and deployed effectively
■ Cloud data and AI platforms (AWS, Azure, GCP), emphasizing the native integrations and optimal deployment patterns for Databricks.
○ Develop and deliver compelling technical presentations, platform demonstrations, and proofs of concept (POCs) that showcase these capabilities to diverse audiences, including technical stakeholders and business leaders.
○ Translate complex technical concepts (including advanced AI/ML, GenAI, and specific Databricks features) into clear, concise, and business-relevant language.
- Proposal Development:
○ Estimate effort and resource requirements for proposed data solutions, leveraging knowledge of typical workload patterns and best practices.
○ Coordinate with delivery and operations management to streamline scoping and estimation capabilities, including the development of the estimation document and process.
○ Develop an understanding of the IBM estimation process and tools and incorporate that into the Hakkoda framework.
- Market and Industry Expertise:
○ Develop relationships with Databricks solution engineering teams, collaborating on new solutions and product development.
○ Develop relationships with partner organizations.
- Cross-Functional Collaboration:
○ Collaborate with marketing to develop technical sales enablement content and collateral
○ Collaborate with Asset and Delivery teams to understand the GenAI asset catalog and its applicability to the pre-sales and delivery process.
This job may be performed from anywhere in the US.
Preferred Education
Master's Degree
Qualifications
Required technical and professional expertise
- Bachelor's degree in Computer Science, Engineering, Data Science, Information Systems, or a related technical field.
- 4 years of experience in a technical pre-sales, solutions architecture, or consulting role, specifically focused on data solutions, with a strong emphasis on AI/ML.
- Demonstrated deep technical expertise and hands-on experience with Databricks.
- Good technical understanding of Generative AI concepts and technologies, including:
○ Retrieval-Augmented Generation (RAG) architectures, with an understanding of how to implement these using vector databases integrated with Databricks.
○ Prompt engineering best practices.
○ Fine-tuning and deployment of generative models on cloud platforms, including considerations for Databricks environments.
○ Familiarity with Generative AI frameworks and libraries (e.g., LangChain, LlamaIndex, Hugging Face Transformers) in the context of Databricks.
○ Hands-on experience with cloud-based GenAI services (e.g., AWS Bedrock, Azure OpenAI Service, Google Cloud Vertex AI) and their integration with Databricks as the data foundation.
- Deep technical expertise in several of the following areas:
○ ETL/ELT tools and methodologies.
○ Business Intelligence and Data Visualization tools (e.g., Tableau, Power BI, Qlik Sense).
○ SQL proficiency and experience with various database systems (relational, NoSQL, vector databases).
○ Cloud data platforms (AWS, Azure, GCP) – certifications are a strong plus, especially Databricks certifications.
○ Data Governance, Data Quality, and Master Data Management (MDM) concepts.
○ Programming languages such as Python or Scala for data manipulation/analysis and AI model development within Databricks.
- Proven ability to articulate complex technical solutions to both technical and non-technical audiences.
- Excellent communication, presentation, and interpersonal skills.
- Strong problem-solving abilities and a consultative approach to sales.
- Ability to work independently and as part of a collaborative team.
Master's degree preferred.
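For context on the Retrieval-Augmented Generation (RAG) pattern the qualifications reference, the core idea can be sketched in plain Python. This is a toy illustration only: the bag-of-words "embeddings" and in-memory retrieval below stand in for a real embedding model and vector database, and none of the function names correspond to a specific Databricks or library API.

```python
# Minimal RAG sketch: embed documents, retrieve the most relevant
# ones for a query, and build an augmented prompt for a generative
# model. Real systems use learned embeddings and a vector store.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: term-frequency vector over lowercase tokens."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the user's question with retrieved context before generation."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [
    "Delta Lake tables store data in Parquet with a transaction log.",
    "Prompt engineering shapes model behavior via instructions.",
]
prompt = build_prompt("How are Delta Lake tables stored?", docs)
```

The resulting `prompt` string would then be sent to an LLM; in production, the retrieval step would typically hit a managed vector index rather than an in-memory list.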