questions.json
{"questions": [{"question": "What is the LLAMA2 70b model?", "choices": ["A large language model released by Meta.ai", "A web interface for using language models", "An open weights model owned by OpenAI", "A neural network architecture"], "answer": "A large language model released by Meta.ai", "question_script": "What is the LLAMA2 70b model?", "correct_answer_script": "You got it! The LLAMA2 70b model is a large language model released by Meta.ai.", "incorrect_answer_script": "Actually, the LLAMA2 70b model is a large language model released by Meta.ai. It is a 70 billion parameter model that is part of the LLAMA series of language models. It is an open weights model, meaning that the weights, architecture, and a paper were all released by Meta, so anyone can work with this model easily by themselves.", "sentence_in_transcription_before_asking": "So for example, working with the specific example of the LLAMA2 70b model, this is a large language model released by Meta.ai, and this is basically the LLAMA series of language models, the second iteration of it, and this is the 70 billion parameter model of this series."}, {"question": "What is the main task of the neural network in the LLAMA2 70b model?", "choices": ["To compress a large chunk of the internet into a zip file", "To predict the next word in a sequence", "To run the model inference process", "To train the model using a GPU cluster"], "answer": "To predict the next word in a sequence", "question_script": "What is the main task of the neural network in the LLAMA2 70b model?", "correct_answer_script": "You got it! The main task of the neural network in the LLAMA2 70b model is to predict the next word in a sequence.", "incorrect_answer_script": "Actually, the main task of the neural network in the LLAMA2 70b model is to predict the next word in a sequence. The neural network takes in a sequence of words and uses the parameters dispersed throughout the network to predict the next word in the sequence with a certain probability.", "sentence_in_transcription_before_asking": "So what is this neural network really doing, right? I mentioned that there are these parameters. This neural network basically is just trying to predict the next word in a sequence."}, {"question": "What is the process of obtaining the parameters for the LLAMA2 70b model?", "choices": ["Model inference", "Model training", "Model compression", "Model deployment"], "answer": "Model training", "question_script": "What is the process of obtaining the parameters for the LLAMA2 70b model?", "correct_answer_script": "You got it! The process of obtaining the parameters for the LLAMA2 70b model is called model training.", "incorrect_answer_script": "Actually, the process of obtaining the parameters for the LLAMA2 70b model is called model training. This is a computationally involved process that involves compressing a large chunk of the internet using a GPU cluster. The parameters obtained through model training can be thought of as a zip file of the internet, but with lossy compression.", "sentence_in_transcription_before_asking": "So how do we get the parameters and where are they from? 
Because whatever is in the run.c file, the neural network architecture and sort of the forward pass of that network, everything is algorithmically understood and open and so on, but the magic really is in the parameters, and how do we obtain them?"}, {"question": "What is the difference between the LLAMA2 70b model and other language models like ChatGPT?", "choices": ["The LLAMA2 70b model is a closed weights model, while ChatGPT is an open weights model", "The LLAMA2 70b model is an open weights model, while ChatGPT is a closed weights model", "The LLAMA2 70b model is a web interface, while ChatGPT is a neural network architecture", "The LLAMA2 70b model is a neural network architecture, while ChatGPT is a web interface"], "answer": "The LLAMA2 70b model is an open weights model, while ChatGPT is a closed weights model", "question_script": "What is the difference between the LLAMA2 70b model and other language models like ChatGPT?", "correct_answer_script": "You got it! The LLAMA2 70b model is an open weights model, meaning that the weights, architecture, and a paper were all released by Meta, so anyone can work with this model easily by themselves. On the other hand, ChatGPT is a closed weights model, meaning that the model architecture is owned by OpenAI and you're only allowed to use the language model through a web interface, but you don't have actual access to the model.", "incorrect_answer_script": "Actually, the LLAMA2 70b model is an open weights model, meaning that the weights, architecture, and a paper were all released by Meta, so anyone can work with this model easily by themselves. On the other hand, ChatGPT is a closed weights model, meaning that the model architecture is owned by OpenAI and you're only allowed to use the language model through a web interface, but you don't have actual access to the model.", "sentence_in_transcription_before_asking": "For example, if you're using ChatsGPT or something like that, the model architecture was never released. It is owned by OpenAI, and you're allowed to use the language model through a web interface, but you don't have actually access to that model."}]}
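The schema above appears intended to drive an interactive quiz layered over a talk transcription: sentence_in_transcription_before_asking marks where in the transcript each question should be raised, choices and answer define the multiple-choice options, and the *_script fields hold scripted feedback for correct and incorrect responses. How the file is actually consumed is not shown here, so the following is only a minimal sketch, assuming a simple console quiz; the load_questions and ask helpers, and the shuffling of choices, are illustrative assumptions rather than part of the project.

```python
import json
import random


def load_questions(path="questions.json"):
    """Load the quiz data; each entry carries the question text, its
    choices, the correct answer, and scripted feedback strings."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)["questions"]


def ask(entry):
    """Present one question on the console and return True if the
    user picks the correct choice."""
    print(entry["question_script"])
    choices = list(entry["choices"])
    random.shuffle(choices)  # the correct answer is not always listed first in the data, but shuffle anyway
    for i, choice in enumerate(choices, start=1):
        print(f"  {i}. {choice}")
    picked = choices[int(input("Your answer (number): ")) - 1]
    if picked == entry["answer"]:
        print(entry["correct_answer_script"])
        return True
    print(entry["incorrect_answer_script"])
    return False


if __name__ == "__main__":
    questions = load_questions()
    score = sum(ask(q) for q in questions)
    print(f"Score: {score}/{len(questions)}")
```

In a transcript-following application, the same entries could instead be matched against sentence_in_transcription_before_asking so each question is asked immediately after the corresponding sentence is reached.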