NLP MCQ ON MODULE TEST: Natural Language Processing
Get free Natural Language Processing MCQs in one place for last-minute preparation for the CDAC NLP module test. Modern NLP algorithms are based on machine learning, especially statistical machine learning, a paradigm quite different from most earlier approaches to language processing. Use these questions to sharpen your grasp of the fundamentals and the core algorithms.
1. Which of the following includes major tasks of NLP?
2. Choose from the following areas where NLP can be useful.
3. What are the main challenges of NLP?
4. The Bag-of-Words approach _________.
5. Which step is the process of breaking down documents into smaller units of analysis?
6. Which is a model of measuring the incidence of known words?
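For questions 5 and 6: tokenization breaks a document into smaller units (tokens), and a Bag-of-Words model then measures the incidence of known words by counting them. A minimal sketch using scikit-learn's CountVectorizer; the two sample documents are invented purely for illustration:

    from sklearn.feature_extraction.text import CountVectorizer

    # Toy corpus, invented for illustration only.
    docs = ["the cat sat on the mat", "the dog sat on the log"]

    # CountVectorizer tokenizes each document and counts occurrences of known words.
    vectorizer = CountVectorizer()
    bow = vectorizer.fit_transform(docs)

    print(vectorizer.get_feature_names_out())  # learned vocabulary
    print(bow.toarray())                       # per-document word counts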
7. From the sentence “Fintech Online Course”, how many bigrams can be created?
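A worked example for question 7: bigrams are pairs of adjacent tokens, so a three-word sentence yields exactly two bigrams ("Fintech Online" and "Online Course"). In plain Python:

    sentence = "Fintech Online Course"
    tokens = sentence.split()
    # Pair each token with the token that follows it.
    bigrams = list(zip(tokens, tokens[1:]))
    print(bigrams)       # [('Fintech', 'Online'), ('Online', 'Course')]
    print(len(bigrams))  # 2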
8. Which are included in named entity recognition?
9. Which is a collection of documents?
10. What is the role of Stemming and Lemmatization in NLP?
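For question 10: stemming crudely strips word endings, while lemmatization maps a word to its dictionary form. A minimal NLTK sketch, assuming the WordNet data can be downloaded:

    import nltk
    from nltk.stem import PorterStemmer, WordNetLemmatizer

    nltk.download("wordnet", quiet=True)  # the lemmatizer needs the WordNet corpus

    stemmer = PorterStemmer()
    lemmatizer = WordNetLemmatizer()

    print(stemmer.stem("studies"))                   # studi  (crude suffix stripping)
    print(lemmatizer.lemmatize("studies", pos="n"))  # study  (valid dictionary form)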
11. What are word embeddings used for in NLP (Natural Language Processing)?
12. What is the main advantage of using word embeddings in NLP models?
13. Which of the following is true about the vector representations generated by GloVe and word2vec?
14. What is word2vec?
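For questions 11 to 14: word2vec learns dense vector representations (embeddings) of words from raw text, so that words used in similar contexts end up with similar vectors. A minimal sketch with Gensim (assuming Gensim 4.x) on a toy corpus invented for illustration:

    from gensim.models import Word2Vec

    # Tiny toy corpus: each sentence is a list of tokens.
    sentences = [["king", "rules", "the", "kingdom"],
                 ["queen", "rules", "the", "kingdom"],
                 ["dog", "chases", "the", "cat"]]

    # sg=1 selects the skip-gram architecture; sg=0 would be CBOW.
    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

    print(model.wv["king"].shape)                # (50,) dense vector
    print(model.wv.similarity("king", "queen"))  # cosine similarity of the two vectors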
15. What is the primary use case of Seq2Seq models in NLP?
16. What is the role of the encoder component in a Seq2Seq model?
17. How does a Seq2Seq model handle the problem of variable-length input and output sequences?
18. What is the problem of vanishing gradients in RNNs and how is it addressed?
19. How is the output generated in an RNN for NLP tasks?
20. What is the purpose of the output gate in LSTMs?
21. How does the reset gate in GRUs control the flow of information from the previous hidden state?
22. What is the main difference between LSTM and GRU units in numerical computations?
23. What is Long Short-Term Memory (LSTM)?
24. What is BERT?
25. What is the main advantage of BERT compared to traditional word embedding techniques?
26. How is BERT used for NLP tasks?
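For questions 24 to 26: BERT produces contextual embeddings, so the same word receives different vectors in different sentences, and the pretrained model is usually fine-tuned for a downstream task. A minimal feature-extraction sketch with the Hugging Face transformers library, assuming the bert-base-uncased weights can be downloaded:

    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    inputs = tokenizer("The bank raised the interest rate", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # One contextual vector per input token (including [CLS] and [SEP]).
    print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, 8, 768])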
27. What is the main advantage of transformers compared to other types of neural networks in NLP?
28. What is the purpose of the self-attention mechanism in transformers?
29. What is the difference between the encoder and decoder in a transformer-based NLP model?
30. What are some common applications of discourse analysis in NLP?
31. What are some common categories of named entities recognized by NER systems in NLP?
32. What is the role of the acoustic model in an ASR system?
33. What are the primary challenges in ASR systems?
34. How does the attention mechanism work in NLP?
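For questions 28 and 34: self-attention lets every position attend to (weight) every other position directly, rather than passing information step by step as an RNN does. A bare-bones scaled dot-product attention sketch in NumPy, with random toy matrices:

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        d_k = K.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarity scores
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
        return weights @ V                              # weighted sum of the values

    rng = np.random.default_rng(0)
    Q = rng.normal(size=(4, 8))  # 4 positions, dimension 8
    K = rng.normal(size=(4, 8))
    V = rng.normal(size=(4, 8))
    print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)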
35. What is the main difference between traditional NLP models and models with an attention mechanism?
36. What is an example of text normalization in NLP?
37. What is the main goal of using SVM in NLP?
38. What is the most commonly used library for SVM in NLP?
39. What is the main advantage of using SVM in NLP?
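For questions 37 to 39: SVMs are widely used for text classification, and in practice this usually means scikit-learn. A minimal TF-IDF plus linear SVM sketch on a made-up two-class toy dataset:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Toy labelled corpus, invented for illustration.
    texts = ["great movie, loved it",
             "terrible film, waste of time",
             "what a wonderful story",
             "boring and badly acted"]
    labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

    clf = make_pipeline(TfidfVectorizer(), LinearSVC())
    clf.fit(texts, labels)
    print(clf.predict(["loved the story"]))  # likely [1] (positive)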
40. Summarization creates new phrases paraphrasing the original source.
41. "I Saw The Boy With A Pony Tail ", What Type Of Ambiguity Does Sentence Have
*
1/1
42. Pragmatic analysis is the ________ stage in NLP.
43. Syntactic analysis is done at the ______________ level.
44. In NLP, the process of removing words like “and”, “is”, “a”, “an”, and “the” from a sentence is called _________.
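For question 44: words such as "and", "is", "a", "an", and "the" are stop words, and dropping them is stop word removal. A small sketch using NLTK's English stop word list, assuming the stopwords corpus can be downloaded:

    import nltk
    from nltk.corpus import stopwords

    nltk.download("stopwords", quiet=True)
    stop_words = set(stopwords.words("english"))

    sentence = "this is an example of a sentence with the stop words removed"
    filtered = [w for w in sentence.split() if w not in stop_words]
    print(filtered)  # ['example', 'sentence', 'stop', 'words', 'removed']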
45. Which of the following is a famous stemming algorithm?
46. Consider the statement "The students went to class". Assign POS tags for the statement.
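For question 46, one quick way to check POS tags is NLTK's averaged perceptron tagger, assuming its model data can be downloaded; the tags in the comment are what it typically returns:

    import nltk

    nltk.download("punkt", quiet=True)
    nltk.download("averaged_perceptron_tagger", quiet=True)

    tokens = nltk.word_tokenize("The students went to class")
    print(nltk.pos_tag(tokens))
    # Typically: [('The', 'DT'), ('students', 'NNS'), ('went', 'VBD'),
    #             ('to', 'TO'), ('class', 'NN')]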
47. What is anaphora?
48. Which of the following is a common method used to determine the similarity between two documents in NLP?
49. How is cosine similarity calculated?
50. What is the range of cosine similarity values?
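For questions 48 to 50: cosine similarity is the dot product of two vectors divided by the product of their magnitudes; it lies between -1 and 1 in general, and between 0 and 1 for non-negative bag-of-words vectors. A small NumPy sketch with made-up term-count vectors:

    import numpy as np

    def cosine_similarity(a, b):
        # dot(a, b) / (||a|| * ||b||)
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    doc1 = np.array([1, 2, 0, 1])  # toy term counts for document 1
    doc2 = np.array([1, 1, 1, 0])  # toy term counts for document 2
    print(cosine_similarity(doc1, doc2))  # about 0.71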
51. What is the primary data structure used in spaCy for processing text?
52. How does spaCy handle named entity recognition (NER)?
53. Can spaCy be used for language model training?
54. What is the default tokenization method used by spaCy?
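For questions 51 to 54: spaCy wraps processed text in a Doc object, runs NER as part of its statistical pipeline, and tokenizes with a rule-based tokenizer. A minimal sketch, assuming the en_core_web_sm model has been installed:

    import spacy

    nlp = spacy.load("en_core_web_sm")
    doc = nlp("Apple is looking at buying a U.K. startup for $1 billion")

    print([token.text for token in doc])                 # rule-based tokenization
    print([(ent.text, ent.label_) for ent in doc.ents])  # named entities from the pipeline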
55. Which of the options is not an NLP application?
56. What is the purpose of negative sampling in the Skip-Gram model?
57. How does negative sampling work in the Skip-Gram model?
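For questions 56 and 57: negative sampling replaces the expensive softmax over the whole vocabulary with a handful of binary classifications, sampling a few "negative" words for every positive (target, context) pair. In Gensim's skip-gram model this is controlled by the negative parameter (again assuming Gensim 4.x, toy corpus invented for illustration):

    from gensim.models import Word2Vec

    sentences = [["fintech", "online", "course"],
                 ["online", "nlp", "course"]]

    # sg=1 -> skip-gram; negative=5 -> five negative samples per positive pair.
    model = Word2Vec(sentences, vector_size=20, window=2,
                     min_count=1, sg=1, negative=5)
    print(model.wv.most_similar("course", topn=2))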
58. Suppose you have a 10,000-word vocabulary and are learning 500-dimensional word embeddings. The GloVe model minimizes this objective:
Which of these statements are correct?
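For reference, the objective question 58 refers to is the GloVe loss; in the notation usually used in course material, with word vectors θ_i and e_j, biases b_i and b'_j, and co-occurrence counts X_ij, it reads:

    \min \sum_{i=1}^{10000} \sum_{j=1}^{10000} f(X_{ij}) \left( \theta_i^{\top} e_j + b_i + b'_j - \log X_{ij} \right)^2

where f(X_ij) is a weighting function that is zero whenever X_ij = 0, so undefined log 0 terms never contribute.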
59. Which of these equations do you think should hold for a good word embedding?
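Question 59 points at the analogy property of good word embeddings: consistent relations should show up as consistent vector differences, the classic example being

    e_{\text{man}} - e_{\text{woman}} \approx e_{\text{king}} - e_{\text{queen}}

so that "king" minus "man" plus "woman" lands near "queen".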