Description
Data Architect (Salesforce / Python)
Location: Remote (Client sits in San Francisco, California)
*Please note, this role is not able to offer visa transfer or sponsorship now or in the future*
Qualifications:
- Willingness to learn and understand Salesforce, connecting and collaborating with teams on business requirements
- Experience with data warehouse fact and dimension tables and slowly changing dimension (SCD) data models (see the sketch after this list)
- Experience in Data Visualization
- Work can change dynamically at any point in time, so quick adaptation and delivery skills are needed
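For illustration only, below is a minimal Python/pandas sketch of a Type 2 slowly changing dimension update. The table and column names (customer_id, address, effective_from, effective_to, is_current) are hypothetical and are not details taken from this posting.

```python
# Illustrative sketch only: a Type 2 slowly changing dimension (SCD) update
# with pandas. All column names (customer_id, address, effective_from,
# effective_to, is_current) are hypothetical; brand-new customers are
# omitted for brevity.
from datetime import date

import pandas as pd


def scd2_merge(dim: pd.DataFrame, updates: pd.DataFrame, today: date) -> pd.DataFrame:
    """Expire changed dimension rows and append new current versions."""
    current = dim[dim["is_current"]]
    merged = current.merge(updates, on="customer_id", suffixes=("", "_new"))
    changed_ids = merged.loc[merged["address"] != merged["address_new"], "customer_id"]

    # Close out the old versions of customers whose attributes changed.
    expire_mask = dim["customer_id"].isin(changed_ids) & dim["is_current"]
    dim.loc[expire_mask, "effective_to"] = pd.Timestamp(today)
    dim.loc[expire_mask, "is_current"] = False

    # Append the incoming rows as the new current versions.
    new_rows = updates[updates["customer_id"].isin(changed_ids)].assign(
        effective_from=pd.Timestamp(today), effective_to=pd.NaT, is_current=True
    )
    return pd.concat([dim, new_rows], ignore_index=True)
```

In a warehouse this logic would more commonly be expressed as a SQL MERGE or a dbt snapshot rather than in pandas; the sketch is only meant to show the expire-and-append pattern.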
Job Description:
- Deep understanding of data engineering concepts, database designs, associated tools, system components, internal processes, and architecture.
- Experience working as a technical lead/solution architect in a customer-focused team
- Experience working closely with Analytics/Data Science teams
- Experience working with data pipelines and heterogeneous data sources, e.g., pulling structured/unstructured data from multiple sources into a data lake.
- Experience building pipelines from REST APIs (see the sketch after this list)
- Proficiency with SQL, Bash, and Python scripting
- Must have basic knowledge of Salesforce Data Models.
- Must be able to proactively communicate status and identify risks. Should have experience in documenting important work items.
- Prior experience in AWS or GCP is a huge plus
- Experience in Snowflake is a plus
- Experience with data visualization tools (e.g., Einstein Analytics, Tableau, Domo, Birst) is a plus
- Experience with the modern data tech stack, such as dbt, is a huge plus
- Champions continuous learning and advocates as an SME to business teams.
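As a rough illustration of the REST-API-to-data-lake work listed above, here is a minimal Python sketch that pages through a hypothetical API with requests and lands raw JSON in an S3 data lake with boto3. The endpoint, bucket, and pagination scheme are assumptions for the example, not details of the client's stack.

```python
# Illustrative sketch only: pull paginated records from a REST API and land
# them as raw JSON files in a data lake (S3 here). The endpoint, bucket, and
# pagination scheme are hypothetical.
import json
from datetime import datetime, timezone

import boto3
import requests

API_URL = "https://api.example.com/v1/accounts"   # hypothetical source API
BUCKET = "my-data-lake-raw"                       # hypothetical landing bucket


def extract_to_lake() -> None:
    s3 = boto3.client("s3")
    run_ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    page = 1
    while True:
        resp = requests.get(API_URL, params={"page": page}, timeout=30)
        resp.raise_for_status()
        records = resp.json().get("results", [])
        if not records:
            break
        # Partition raw files by run timestamp and page number.
        key = f"raw/accounts/run={run_ts}/page={page:05d}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(records).encode("utf-8"))
        page += 1


if __name__ == "__main__":
    extract_to_lake()
```

A production version of such a pipeline would typically add retries, incremental watermarks, and schema validation before the data feeds downstream models.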
Responsibilities:
- Build and implement engineering pipelines/frameworks on top of Data Cloud to support upcoming prioritized features from the Truth Profile Product Team.
- Maintain existing engineering pipelines that are built using Python/shell scripts/Data Cloud Streams and owned by the Truth Profile Team.
- Build required segments on Data Cloud to support prioritized features.
- Activate segments on various channels depending on the requirements.
- Implement and follow standard data engineering processes within Truth Profile (TP).
- Provide support to the product or partner teams for any analysis tasks requested.
- Create and publish documentation for all the work required to build these features.
Must Have Skills
- Python
Good To Have Skills
- Airflow (see the DAG sketch after this list)
- SQL
- Unix
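To show how the skills above typically fit together, here is a minimal Airflow DAG sketch (assuming Airflow 2.4+) that chains a Unix shell step and a Python step on a daily schedule. The DAG id, task ids, and commands are hypothetical and not taken from this posting.

```python
# Illustrative sketch only: a daily Airflow DAG (2.4+) with a Bash (Unix) task
# feeding a Python task. DAG id, task ids, and commands are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def load_to_warehouse() -> None:
    # Placeholder for the Python load step (e.g., loading extracted files).
    print("loading extracted files into the warehouse")


with DAG(
    dag_id="daily_profile_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_raw_files",
        bash_command="echo 'pull raw files with a shell script'",
    )
    load = PythonOperator(
        task_id="load_to_warehouse",
        python_callable=load_to_warehouse,
    )

    extract >> load  # run the shell extract before the Python load
```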
Salary and Other Compensation:
The annual salary for this position is between $103,130.00 and $148,910.00, depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant’s discretionary annual incentive program, based on performance and subject to the terms of Cognizant’s applicable plans.
Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
• Medical/Dental/Vision/Life Insurance
• Paid holidays plus Paid Time Off
• 401(k) plan and contributions
• Long-term/Short-term Disability
• Paid Parental Leave
• Employee Stock Purchase Plan
Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
Qualifications
Technical Skills
SNo | Primary Skill | Proficiency Level * | Rqrd./Dsrd.
1 | Airflow | PL3 | Desired
2 | SQL | PL3 | Desired
3 | Python | PL3 | Required
4 | Unix | PL3 | Desired
* Proficiency Legends
Proficiency Level | Generic Reference
PL1 | The associate has basic awareness and comprehension of the skill and is in the process of acquiring this skill through various channels.
PL2 | The associate possesses working knowledge of the skill, and can actively and independently apply this skill in engagements and projects.
PL3 | The associate has comprehensive, in-depth and specialized knowledge of the skill. She/he has extensively demonstrated successful application of the skill in engagements or projects.
PL4 | The associate can function as a subject matter expert for this skill. The associate is capable of analyzing, evaluating and synthesizing solutions using the skill.