Big Data Java Development
Big Data Java Engineer (Greenplum / Cassandra)
San Ramon, CA
• Bachelor's degree in Computer Science, Information Technology, or an equivalent STEM field, with a minimum of 8-10+ years of experience as a data engineer.
• A minimum of 2 years of experience with the Hadoop ecosystem, MapReduce, Spark, and NoSQL databases (HBase, MongoDB, etc.).
• A minimum of 2 years of experience with scripting (Pig, Python, Perl, etc.) is required.
• A minimum of 5 years of experience with Java, web services, and REST APIs.
• Advanced degrees preferred.
• Primary role in recent positions must be as a lead data/big data engineer.
• Ability to leverage data assets to respond to complex questions that require timely answers.
• Must have superior communication skills, both oral and written.
• Must be able to function productively in an ambiguous environment.
Software development firm looking for a MongoDB architect/engineer for a short-term review and assessment of a newly developed web-based software product. The candidate would conduct a thorough review of the existing schema model and deliver recommendations for optimizing data-retrieval performance and the ability to scale.
• Determines database structural requirements by analyzing client operations, applications, and programming; evaluating current systems; and preparing for migration to MongoDB.
• Hands-on experience with NoSQL systems (especially MongoDB) hosted in a cloud environment such as Azure.
• Has a passion for big data technologies and a flexible, creative approach to problem solving.
• Develops database solutions by designing the proposed system and defining the database's physical structure, functional capabilities, security, backup, and recovery specifications.
• Bachelor's degree (Computer Science or related) or beyond.
Big Data Migration
Enterprise big data migration to data lakes & transformation roadmap
· Industry criteria: technology
· Domain: research and analytics
· Function: analytics
· Geographic focus: US
· Previous experience / past organizations worked with: Apple, Samsung, Microsoft
· Years of experience: 10–15
· Key specialisation: big data
· Relevant keywords: big data migration
· Certification: product (GBS, IPTS, LSS, MDM)
· Key responsibilities:
  · Understanding of big data; prior hands-on knowledge of migrating big data from data warehouses and allied sources into smaller, relevant data lakes.
  · Experience in enterprise big data migration to data lakes and a transformation roadmap.
· Duration of engagement: 15 days
We're looking to connect with a MongoDB expert to review our design and help optimise for large data sets. We would also need assistance on a number of different use cases.
Big Data Analytics
I am currently recruiting for my client, an IT service provider/consulting firm, for a Big Data/Hadoop architect to join their project in Hudson, OH / Palo Alto, CA / New York, NY / Chicago, IL / Boston, MA / Phoenix, AZ / Atlanta, GA / Louisville, KY (across the USA). Candidates need to be a US citizen or green card holder, or hold an EAD or a valid H-1B or TN visa. Please feel free to contact me for more info or forward my details to someone suitable in your network. Thank you!
I need help creating a sharded cluster. I have the replica sets created (2 + 1 arbiter) with one config server (mongod) and one routing server (mongos). I am having issues connecting the sharded replica set to the router.
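A minimal sketch of the step that usually causes this issue: registering the replica set as a shard from the mongos shell. The replica-set name `rs0` and all hostnames below are illustrative placeholders, not details from the post.

```javascript
// Run these against the mongos router, not against a shard's mongod.
// "rs0" and the host:port pairs are placeholder values.

// Register the replica set as a shard; the seed string must be the
// replica-set name, a slash, then member host:port pairs.
sh.addShard("rs0/shard-host-1:27018,shard-host-2:27018");

// Confirm the shard is registered and reachable from the router.
sh.status();
```

Two common pitfalls worth checking: each shard `mongod` must be started with `--shardsvr`, and the config server must be started with `--configsvr` and passed to `mongos` via `--configdb`; otherwise the router cannot attach the replica set.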
We are looking for a subject matter expert on big data who can volunteer to deliver a lecture / interactive session to our academic affiliates and corporate partners. We are one of the world's largest professional bodies for engineers. We provide a knowledge-sharing platform for engineering colleges in India and globally, helping them achieve their goals in key areas: access to knowledge, international recognition and networking, strong industry connections, and high-quality technical education.
I'm using MongoDB as my database. I have records for 100 hospitals. To store all the information, I'm designing my database schema, and I see three possible solutions:
1. Create a collection for each hospital and store its data there.
Problem: I need to fetch records from all hospitals at once, so this seems very complex.
2. Create a single collection with one embedded document per hospital.
Problem: the document grows over time, but MongoDB documents are limited in size, and a particular query needs to read all the documents.
3. Create a single collection and add a hospital ID field to each document.
Problem: the collection size increases, and a particular query needs to read all the documents.
Please tell me which of these is the better solution. If another solution is possible, please suggest it.
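A minimal in-memory sketch of the third option: one collection where every document carries a hospital ID. The collection name, field names, and sample data are illustrative, not from the post; a real deployment would issue the equivalent find() through a driver such as PyMongo.

```python
# Option 3 sketch: a single "records" collection where every document
# carries a hospital_id. All data and field names are made up for
# illustration only.
records = [
    {"hospital_id": 1, "patient": "A", "dept": "cardiology"},
    {"hospital_id": 1, "patient": "B", "dept": "oncology"},
    {"hospital_id": 2, "patient": "C", "dept": "cardiology"},
]

def find(collection, **filters):
    """Mimic a MongoDB equality find() over an in-memory collection."""
    return [doc for doc in collection
            if all(doc.get(k) == v for k, v in filters.items())]

# One hospital's records -- equivalent to db.records.find({"hospital_id": 1}).
hospital_1 = find(records, hospital_id=1)

# A cross-hospital query is just another filter; with one collection
# per hospital (option 1), this would mean querying 100 collections.
cardiology = find(records, dept="cardiology")

print(len(hospital_1), len(cardiology))  # 2 2
```

In MongoDB itself, an index on the hospital ID field (`db.records.createIndex({"hospital_id": 1})`) keeps per-hospital queries from scanning the whole collection, which addresses the "read all documents" concern in option 3.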
Silicon Valley start-up searching for a MongoDB expert who can help us with the design and coding of a revolutionary worldwide e-commerce offering.
Bioinformatics & Big Data Investments
For a current project, we are trying to understand the funding landscape and investment trends in the bioinformatics/big data space.
We are only interested in shared industry learnings rather than confidential information. In exchange for your contribution, we would be happy to compensate you for your time. If you're interested, kindly reply with your contact information (e.g. email, phone, Skype) and we will connect with you as soon as possible.
Thank you in advance, and I look forward to hearing from you.
Alina Strickler
Telecommunication Big Data
Looking for a big data consultant in the telecommunications space (geo-location and network analytics) to assess, recommend, and implement a roadmap for big data.