You’ve now seen a bunch of these token definitions: one for words, one for strings. A lexical analyzer, or lexer (this is a term of art), is just a collection of such token definitions. You tell me what makes a word, what makes a string, what makes a number, what makes whitespace. You put it all together, and the result is a lexer: something that splits a string into exactly the tokens you’ve defined. For example, once I put these three rules together, they become a lexer, or lexical analyzer. Now suppose we passed this lexical analyzer the input string "33 is less than 55". Oh, gentle student, tell me which of these token output sequences could result. In this multiple-choice quiz, indicate which of these three possible output lists could correspond to the values of the tokens extracted from this input string using these rules. Let’s put it all together.
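To make the idea concrete, here is a minimal sketch of such a lexer in Python. The specific rule names (NUMBER, WORD, WHITESPACE) and regular expressions are assumptions for illustration, not the exact rules from the lesson: the point is that a lexer is just a list of token definitions tried in order against the remaining input.

```python
import re

# Hypothetical token rules, one regular expression per token kind.
# The names and patterns here are illustrative assumptions.
TOKEN_RULES = [
    ("NUMBER",     r"[0-9]+"),
    ("WORD",       r"[a-zA-Z]+"),
    ("WHITESPACE", r"[ \t]+"),  # matched so we can advance, but not reported
]

def lex(text):
    """Split text into (kind, value) tokens using TOKEN_RULES."""
    tokens = []
    pos = 0
    while pos < len(text):
        for kind, pattern in TOKEN_RULES:
            match = re.match(pattern, text[pos:])
            if match:
                if kind != "WHITESPACE":  # skip whitespace in the output
                    tokens.append((kind, match.group(0)))
                pos += match.end()
                break
        else:
            raise ValueError(f"no token rule matches at position {pos}")
    return tokens

print(lex("33 is less than 55"))
```

Running this on the quiz's input string yields a NUMBER token for "33", WORD tokens for "is", "less", and "than", and a NUMBER token for "55", with the whitespace between them consumed silently.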