Baidu NLP Algorithm Engineer (Intern) (J85176)
Internship / Part-time · ACG · Location: Shanghai · Status: Recruiting
Requirements
- Solid grounding in machine learning and natural language processing; familiar with network architectures such as Transformer and LSTM, and with the theory behind well-known models such as BERT, RoBERTa, and GPT
- Strong Python programming skills and basic backend service development ability; familiar with at least one mainstream deep learning training framework, such as Ten…
Job Responsibilities
- Use and optimize existing tooling to complete data collation and data cleaning
- Optimize and generalize data augmentation strategies
- Train models and bring them to production
- Assist in delivering commercial project requirements
Includes English-language materials
Machine Learning+
https://www.youtube.com/watch?v=0oyDqO8PjIg
Learn about machine learning and AI with this comprehensive 11-hour course from @LunarTech_ai.
https://www.youtube.com/watch?v=i_LwzRVP7bg
Learn Machine Learning in a way that is accessible to absolute beginners.
https://www.youtube.com/watch?v=NWONeJKn6kc
Learn the theory and practical application of machine learning concepts in this comprehensive course for beginners.
https://www.youtube.com/watch?v=PcbuKRNtCUc
Learn about all the most important concepts and terms related to machine learning and AI.
NLP+
https://www.youtube.com/watch?v=fNxaJsNG3-s&list=PLQY2H8rRoyvzDbLUZkbudP-MFQZwNmU4S
Welcome to Zero to Hero for Natural Language Processing using TensorFlow!
https://www.youtube.com/watch?v=R-AG4-qZs1A&list=PLeo1K3hjS3uuvuAXhYjV2lMEShq2UYSwX
Natural Language Processing tutorial for beginners series in Python.
https://www.youtube.com/watch?v=rmVRLeJRkl4&list=PLoROMvodv4rMFqRtEuo6SGjY4XbRIVRd4
The foundations of the effective modern methods for deep learning applied to NLP.
Transformer+
https://huggingface.co/learn/llm-course/en/chapter1/4
Breaking down how Large Language Models work, visualizing how data flows through.
https://poloclub.github.io/transformer-explainer/
An interactive visualization tool showing you how transformer models work in large language models (LLM) like GPT.
https://www.youtube.com/watch?v=wjZofJX0v4M
Breaking down how Large Language Models work, visualizing how data flows through.
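The transformer resources above all revolve around self-attention. As a minimal illustration only (pure Python, toy inputs, not any production model's implementation), scaled dot-product attention can be sketched as:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d)) V,
    with Q, K, V given as lists of vectors."""
    d = len(K[0])
    out = []
    for q in Q:
        # similarity of this query to every key, scaled by sqrt(d)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        # output = attention-weighted average of the value vectors
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out

# toy example: 2 query vectors attending over 3 key/value pairs
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
print(attention(Q, K, V))
```

Each output row is a convex combination of the value vectors, which is why attention outputs always stay within the range spanned by the values.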
LSTM+
https://colah.github.io/posts/2015-08-Understanding-LSTMs/
Humans don’t start their thinking from scratch every second.
https://d2l.ai/chapter_recurrent-modern/lstm.html
The term “long short-term memory” comes from the following intuition.
https://developer.nvidia.com/discover/lstm
A Long short-term memory (LSTM) is a type of Recurrent Neural Network specially designed to prevent the neural network output for a given input from either decaying or exploding as it cycles through the feedback loops.
https://www.youtube.com/watch?v=YCzL96nL7j0
Basic recurrent neural networks are great, because they can handle different amounts of sequential data, but even relatively small sequences of data can make them difficult to train.
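The gating mechanism described in the LSTM links above can be sketched in a few lines. This is a toy scalar-state version with hand-picked weights, purely to show the gate equations, not a trainable implementation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM step for scalar input and state.
    w maps each gate name to an (input-weight, recurrent-weight, bias) triple."""
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])   # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])   # input gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2]) # candidate cell update
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])   # output gate
    c = f * c_prev + i * g   # additive cell update: this is what limits decay/explosion
    h = o * math.tanh(c)     # hidden state exposed to the rest of the network
    return h, c

# run three timesteps with identical toy weights for every gate
w = {k: (0.5, 0.5, 0.0) for k in ("f", "i", "g", "o")}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_step(x, h, c, w)
print(h, c)
```

The additive form of the cell update `c = f * c_prev + i * g` is the intuition the NVIDIA link describes: information in the cell state can persist or be forgotten gradually instead of being squashed at every step.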
BERT+
https://www.youtube.com/watch?v=xI0HHN5XKDo
Understand the BERT Transformer in and out.
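BERT's pretraining objective, masked language modeling, is easy to show in miniature. The sketch below only covers the masking step (real BERT additionally replaces some selected tokens with random tokens or leaves them unchanged in an 80/10/10 split); the function name and rate are illustrative:

```python
import random

def mask_tokens(tokens, mask_rate=0.15, seed=1):
    """BERT-style masking sketch: replace roughly mask_rate of tokens
    with [MASK] and record the originals as prediction targets."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for idx, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            targets[idx] = tok       # the model must predict this token
            masked.append("[MASK]")
        else:
            masked.append(tok)
    return masked, targets

tokens = "the model predicts the missing word from context".split()
masked, targets = mask_tokens(tokens)
print(masked, targets)
```

During pretraining, the model sees `masked` and is trained to recover the entries of `targets` from bidirectional context, which is what the video above unpacks in detail.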
Related Positions
Internship
1. Participate in the NLP team's R&D work, supporting the daily operation and performance optimization of the XiaoAI dialogue system;
2. Implement and optimize NLP algorithms to improve the product's performance and accuracy in natural language processing;
3. Design and develop tasks in natural language processing, text mining, and large language models;
4. Follow the latest academic progress and keep up with cutting-edge NLP techniques.
Updated 2025-06-17 · Beijing
Internship
1. Participate in the NLP team's R&D work, supporting the daily operation and performance optimization of the XiaoAI dialogue system;
2. Implement and optimize NLP algorithms to improve the product's performance and accuracy in natural language processing;
3. Design and develop tasks in natural language processing, text mining, and large language models;
4. Follow the latest academic progress and keep up with cutting-edge NLP techniques.
Updated 2025-07-23 · Beijing
Internship
[Job Description]
1. Participate in the NLP team's R&D work, supporting the daily operation and performance optimization of the XiaoAI dialogue system;
2. Implement and optimize NLP algorithms to improve the product's performance and accuracy in natural language processing;
3. Design and develop tasks in natural language processing, text mining, and pretrained language models;
4. Follow the latest academic progress and keep up with cutting-edge NLP techniques.
Updated 2023-08-07 · Beijing