
LFX Mentorship Autumn 2024 Pretest - for #126 #130

Open
hsj576 opened this issue Aug 9, 2024 · 3 comments
Labels
lifecycle/past activity Past activities for future reference, not valid any more

Comments

@hsj576 (Member) commented Aug 9, 2024

Introduction

For those who want to apply for the LFX Mentorship for #126, this is the selection test for the application. This mentorship aims to implement cloud-edge collaborative speculative decoding based on KubeEdge-Ianvs, an open-source cloud-edge collaborative distributed machine learning platform, so as to further improve LLM inference speed in cloud-edge environments. Based on Ianvs, we designed this pretest to evaluate the candidates.

Requirements

Each applicant for the LFX Mentorship can attempt the task and receive a score according to completeness. The applicant with the highest score will become the mentee for this LFX Mentorship project. The titles of all task outputs, such as pull requests (PRs), should be prefixed with "LFX Mentorship Autumn 2024 Pretest".

Task

Content

  1. Create a new example on KubeEdge Ianvs for an LLM benchmark.
    • You can choose any LLM benchmark dataset, such as MT-bench, HumanEval, etc.
  2. Implement a cloud-edge collaborative algorithm based on the LLM benchmark example.
    • You can choose to implement any cloud-edge collaborative algorithm, including but not limited to joint inference, speculative decoding, etc.
  3. Submit an experiment report as a PR that includes the algorithm design, experiment results, and a README document.
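Of the algorithms mentioned above, speculative decoding may be the least familiar. As a rough illustration of the idea (a toy sketch only, not the Ianvs implementation): a cheap edge-side draft model proposes several tokens per round, and the cloud-side target model verifies them, falling back to its own token on the first mismatch. Both "models" below are hypothetical stand-in functions, not real LLMs:

```python
# Toy sketch of greedy speculative decoding. `draft_next` plays the cheap
# edge model and `target_next` the authoritative cloud model; both are
# hypothetical stand-ins for real LLM next-token calls.

def target_next(tokens):
    # Authoritative "cloud model": next token is (last + 1) mod 10.
    return (tokens[-1] + 1) % 10

def draft_next(tokens):
    # Cheap "edge model": agrees with the target except after token 5.
    return 0 if tokens[-1] == 5 else (tokens[-1] + 1) % 10

def speculative_decode(prompt, n_new, k=4):
    tokens = list(prompt)
    while len(tokens) < len(prompt) + n_new:
        # 1) Edge: draft k tokens autoregressively with the cheap model.
        ctx, draft = list(tokens), []
        for _ in range(k):
            ctx.append(draft_next(ctx))
            draft.append(ctx[-1])
        # 2) Cloud: verify the drafts; keep the longest matching prefix.
        ctx, accepted = list(tokens), 0
        for t in draft:
            if target_next(ctx) != t:
                break
            ctx.append(t)
            accepted += 1
        tokens.extend(draft[:accepted])
        # 3) On a rejection, emit one token from the target model itself,
        #    so the output always matches plain greedy target decoding.
        if accepted < k:
            tokens.append(target_next(tokens))
    return tokens[len(prompt):len(prompt) + n_new]

print(speculative_decode([0], 8))  # [1, 2, 3, 4, 5, 6, 7, 8]
```

When the draft model agrees with the target, several tokens are committed per verification round, which is where the cloud-edge speedup comes from; the output is identical to decoding with the target model alone.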

Resources

Rating

  1. Each item will be scored based on completion and code quality; the sum of the scores is the total score for this task.
  2. If an example cannot be run directly from the submitted code and the README instructions, the score for Contents 1 and 2 will be 0. So be careful with the code and docs!
| Item | Score |
| --- | --- |
| Content 1 | 50 |
| Content 2 | 25 |
| Content 3 | 25 |

Deadline

According to the timeline of LFX Mentorship Autumn 2024, the admission decision deadline is August 27th, 5:00 PM PDT. Since we need time for internal review and a decision, the final date for pretest PR submissions is August 24th, 5:00 PM PDT.

@MooreZheng added the activity (event that would be ended in certain deadlines) label Aug 13, 2024
@aryan0931 (Contributor) commented:

Hello @hsj576, I am facing this issue while installing Ianvs; please suggest some possible solutions.

[Screenshot: error during Ianvs installation, 2024-08-23]

@FuryMartin (Contributor) commented:

Hi, I have implemented a demo. Please see the details here.

@Sid260303 commented:

> hello @hsj576 I am facing this issue while installing ianvs, please suggest some possible solutions
>
> [Screenshot: error during Ianvs installation, 2024-08-23]

Hey @hsj576, I am having a similar issue, but with `ModuleNotFoundError: No module named 'colorlog'`.
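A likely fix, based only on the error text (an assumption, not confirmed in this thread), is that the `colorlog` package simply isn't installed in the Python environment used to run Ianvs. A minimal setup sketch:

```shell
# Install the missing package into the same environment that runs Ianvs.
pip install colorlog

# Then confirm it imports cleanly before retrying the Ianvs install.
python -c "import colorlog; print(colorlog.__name__)"
```

If other modules are also missing, reinstalling the project's declared dependencies (e.g. via its requirements file, whose exact path should be checked in the Ianvs repository) may be more reliable than installing packages one by one.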

@MooreZheng added the lifecycle/past activity (Past activities for future reference, not valid any more) label and removed the activity (event that would be ended in certain deadlines) label Aug 29, 2024
5 participants