Big Data Architect/Strategist Job in Pleasanton, California, US

Big Data Architect/Strategist

As the Enterprise Architect and Strategist for Big Data Analytics at Kaiser Permanente, you will be responsible for defining the strategic roadmap that enables Kaiser Permanente to harvest the benefits of the Big Data paradigm and framework. You will work closely with both the KP business and IT teams to drive the value proposition and adoption of Big Data Analytics. You will also drive the definition of the ecosystem of technology and service partners, defining the end-to-end capability and functional matrix that enables Big Data Analytics. Finally, you will help inform the solution architecture and deployment strategy, including the roadmap for the key capabilities defined in the Big Data Strategic Framework.

Essential Functions:

- Develop the Big Data Strategy and Framework for KP, including the key capabilities and functions delivered by the underlying supporting architecture, and the analytics capabilities and functions delivered to the Analyst/Data Scientist (e.g., Natural Language Processing and Understanding, Semantic Search).
- Demonstrate analytical and problem-solving skills, particularly those that apply to a Big Data environment, to help translate use cases across the various lines of business at KP, for both analytical and operational needs.
- Help analytical teams push the boundaries of how they use information retrieval, machine learning, computational linguistics, matrix and graph algorithms, unsupervised clustering, and data mining to solve their business problems.
- Drive innovation in the team by encouraging alternative approaches and solutions.
- Define and work with key vendors and service partners to provide technology recommendations, and influence vendor strategic roadmaps.
- Possess a demonstrated ability to navigate very senior-level relationships (CTOs, CIOs, SVPs). Make presentations both to inform senior leadership and to build consensus for decisions.

Qualifications:

Basic Qualifications:

- 15+ years of experience in large-scale software development, design, and architecture.
- Strong cross-functional technical background, excellent written/oral communication skills, and a willingness and capacity to expand leadership and technical skills.
- 1-2 years of hands-on experience with the Big Data stack (e.g., MapReduce, Hadoop, Sqoop, Pig, Hive, HBase, Flume).
- Hands-on experience with productionizing Big Data applications (e.g., administration, configuration management, monitoring, debugging, and performance tuning).
- 3-4 years of hands-on experience with related/complementary open-source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef).
- Experience with ETL (Extract-Transform-Load) tools (e.g., Informatica, Ab Initio, DataStage), BI tools (e.g., Cognos, Business Objects), analytical tools, languages, or libraries (e.g., SAS, SPSS, R, Mahout), and high-scale or distributed RDBMSs (e.g., Teradata, Netezza, Greenplum, Aster Data, Vertica).
- Knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2, Elastic MapReduce) and considerations for scalable, distributed systems.
- Knowledge of NoSQL platforms (e.g., key-value stores, graph databases, RDF triple stores).
- Solid understanding of Natural Language Processing and Understanding, and semantic-based search.