Ask HN: Learning PySpark and Related Tools

9 points · rookie123 · 11 days ago

Hey HN,

I have been working in the data science and machine learning domain for the past 8 years or so, but I have not been exposed to tools such as PySpark that come up frequently in job descriptions. What resources or certifications can I use to get up to par on PySpark?

Thanks!


6 comments
almosthere · 9 days ago
Having used Spark for the past 8 years or so, I can say it's definitely a solid basis for data engineering. I use it the most for generating reports, but sometimes we have large projects to get data into different staging databases. I use it a lot with Elasticsearch or Parquet. Basically, it helps you write large joins and flatten the result into a store that can perform aggregations on the flattened data quickly, like Elasticsearch or a columnar database.
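Roughly, the pattern looks like this (a minimal sketch with made-up paths and column names; it writes Parquet here, though the target could just as well be Elasticsearch via its Spark connector):

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("flatten-report").getOrCreate()

    # Hypothetical inputs: two large tables to join
    orders = spark.read.parquet("s3://bucket/orders/")
    customers = spark.read.parquet("s3://bucket/customers/")

    # Large join, then flatten down to the columns the reports actually need
    flat = (
        orders.join(customers, on="customer_id", how="left")
              .select(
                  "order_id",
                  "customer_id",
                  "country",
                  "order_total",
                  F.to_date("order_ts").alias("order_date"),
              )
    )

    # Write the flattened result somewhere columnar so aggregations are cheap
    flat.write.mode("overwrite").partitionBy("order_date").parquet("s3://bucket/reports/orders_flat/")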
datadrivenangel · 9 days ago
If you have experience with any data frame library (like pandas) and with SQL, you can pick up PySpark pretty easily... with the one caveat that writing good data pipelines in any language gets much harder when you start looking at ways to actually process big data (~20+ TB). Modern SQL engines are so good, though.
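For example, the same aggregation looks almost identical in both (a rough sketch with made-up file and column names), and you can always drop down to plain SQL:

    import pandas as pd
    from pyspark.sql import SparkSession, functions as F

    # pandas version
    pdf = pd.read_csv("events.csv")
    daily_pd = pdf.groupby("event_date")["revenue"].sum().reset_index()

    # PySpark version: nearly the same shape
    spark = SparkSession.builder.getOrCreate()
    sdf = spark.read.csv("events.csv", header=True, inferSchema=True)
    daily_spark = sdf.groupBy("event_date").agg(F.sum("revenue").alias("revenue"))

    # Or skip the DataFrame API and just write SQL
    sdf.createOrReplaceTempView("events")
    daily_sql = spark.sql("SELECT event_date, SUM(revenue) AS revenue FROM events GROUP BY event_date")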
philomath_mn · 11 days ago
francocalvo · 11 days ago
I'm a Data Engineer who uses Spark daily. I guess the only important cert would come from Databricks, but I think it will be more worth your while to read the book mentioned here and try to do a little project ingesting/transforming data.
hnthrowaway0315 · 11 days ago
Just get a job since you are already senior; you can learn it on the job. Find a few tutorials if you must, but people should be able to pick up enough for basic work in a few weeks.