Gregor (@bygregorr)
He explains that weaker models write simpler code, so if a fine-tuned model cannot yet reason about higher-level abstractions, it ends up learning noise. He offers an important perspective: this measures not intelligence per se, but how closely the data distributions match.
In our ongoing effort to put my courses online, we are continuing the lecture series for the Algorithmic Lower Bounds course.
Lesson 16: Algorithmic Lower Bounds by Mohammad Hajiaghayi: Massively Parallel Computation Lower Bounds 1
Lesson 17: Algorithmic Lower Bounds by Mohammad Hajiaghayi: Massively Parallel Computation Lower Bounds 2
Feel free to subscribe to our YouTube channel @hajiaghayi for upcoming lessons, premiering every Wednesday at 7pm ET. (You can access all lectures for the Algorithmic Lower Bounds course, including future sessions, in the following playlist: https://www.youtube.com/playlist... Additionally, all lectures on Introduction to Algorithms are available in this playlist: https://www.youtube.com/@hajiaghayi/playlists)
Jan Olkowski discusses lower bounds in the Massively Parallel Computation (MPC) model, introducing the shuffle model as a framework for understanding which functions are computable. The first lecture covers the formal MPC model and its theoretical challenges; the second extends to upper and lower bounds, focusing on round complexity, per-machine memory limitations, and polynomial representations in the shuffle model. Conditional hardness assumptions yield lower bounds for problems such as maximal matching and connectivity in MPC, and ongoing research aims to sharpen our understanding of what is solvable under these assumptions.
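As a toy illustration of the round complexity discussed above (my own sketch, not code from the lectures), the snippet below simulates connectivity in a synchronous round-based model: in each round every vertex adopts the minimum label in its closed neighborhood. This naive per-hop propagation needs rounds proportional to the graph diameter; genuine MPC algorithms do better with techniques like graph exponentiation, which is exactly the regime the conditional lower bounds constrain. The function name and structure are hypothetical.

```python
# Toy round-based simulation of connectivity (illustrative only, not the
# lecture's algorithm). Each "round" mimics one synchronous communication
# round: every vertex replaces its label with the minimum label among
# itself and its neighbors. Rounds needed grow with graph diameter.

def mpc_connectivity_rounds(n, edges):
    """Return (labels, rounds): per-vertex component labels and rounds used."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    labels = list(range(n))  # every vertex starts in its own component
    rounds = 0
    while True:
        new = [min([labels[v]] + [labels[u] for u in adj[v]]) for v in range(n)]
        rounds += 1
        if new == labels:  # fixed point: labels identify components
            break
        labels = new
    return labels, rounds

# A path on 8 vertices: one component, but many rounds (diameter 7).
path = [(i, i + 1) for i in range(7)]
labels, r = mpc_connectivity_rounds(8, path)
print(len(set(labels)), r)
```

The point of the toy: per-hop propagation pays for diameter in rounds, which motivates both the clever sublogarithmic-round upper-bound techniques and the conditional lower bounds (e.g. hardness of distinguishing one cycle from two) covered in the lectures.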
#MPC #ShuffleModel #LowerBounds #UpperBounds #Polynomials #Connectivity #ConditionalHardness #Algorithms #BigData #ResearchPresentation #DataDistribution