Intern: BigData Platform
Position Type: Internship
Priority: No
Degree Requirement: MS, PhD
Location: Santa Clara, CA
Project Background: OpenChai is an industry and academic collaborative whose mission is to bring predictive analytics to the mainstream enterprise. We are integrating cutting-edge open source frameworks (Apache Spark, Tachyon, Mesos, ML2, Docker, CoreOS, etc.) on heterogeneous hardware (64-bit ARM, OpenPOWER, GPUs, Xeon) to build production-ready, next-generation big data infrastructure. The open source community we are developing focuses specifically on enterprise use cases and workloads.
Required Skills:
- Solid understanding of Spark, ZooKeeper, and Hadoop (required)
- Proficiency in at least one of the following system-administration languages: Ruby, Python, or Go
- Hands-on experience with standard system administration tools, e.g.:
  - you can use yum or apt-get on your own box;
  - you know how /etc/yum.repos.d/ works;
- Deep understanding of JVM performance tuning is a big plus
- Hands-on experience with compiler toolchains and Docker toolchains is a big plus
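As an illustration of the repository-configuration knowledge mentioned above, a typical file in /etc/yum.repos.d/ looks like the sketch below (the file name, repo id, and URLs here are hypothetical):

```ini
# /etc/yum.repos.d/example.repo -- hypothetical repository definition
[example-repo]
name=Example Repository
baseurl=https://repo.example.com/el7/$basearch/
enabled=1
gpgcheck=1
gpgkey=https://repo.example.com/RPM-GPG-KEY-example
```

yum reads every *.repo file in this directory, so dropping a file in (or removing one) adds or removes a package source without editing the main yum.conf.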
What we can provide: Our work is a platform for you to gain exposure to stellar engineers from companies such as Databricks, Docker, and CoreOS. If you are self-motivated and eager to learn, this may be one of the best places to spend your summer.
Contact: This is a Huawei/OpenChai co-hosted internship position. Please send your resume to shuo.yang@huawei.com with a subject line indicating that you are applying for the OpenChai internship.