[PATCH] D75474: [clangd] Get rid of getTokenRange helper

Kadir Cetinkaya via Phabricator via cfe-commits <cfe-commits at lists.llvm.org>
Tue Mar 3 04:11:24 PST 2020


kadircet added inline comments.


================
Comment at: clang-tools-extra/clangd/XRefs.cpp:155
   const SourceManager &SourceMgr = AST.getSourceManager();
-  const FileEntry *F = SourceMgr.getFileEntryForID(SourceMgr.getFileID(TokLoc));
+  auto Tok = syntax::tokenize(syntax::FileRange(SourceMgr, TokLoc, 1),
+                              SourceMgr, AST.getLangOpts());
----------------
sammccall wrote:
> sammccall wrote:
> > sammccall wrote:
> > > so this is just running the raw lexer...
> > > 
> > > both callers have a ParsedAST, can we call spelledTokenAt()? Either here, passing in (TokenBuffer, Loc, TUPath) or in the caller, passing in (syntax::Token*, SourceMgr, TUPath).
> > It *seems* to always be the case that TokLoc is a spelled token, but this is slightly subtle and the previous code defended against macro locations. Worth a comment.
> > can we call spelledTokenAt
> 
> as you pointed out offline, we only have spelled tokens for the main file.
> I think we have to lex here to get the token length, and the clearest thing is to do this directly (`Lexer::MeasureTokenLength` -> `Loc.getLocWithOffset`) rather than obscuring it behind TokenBuffer.
> We should have a comment *why* we lex here.
> so this is just running the raw lexer...
> both callers have a ParsedAST, can we call spelledTokenAt()? Either here, passing in (TokenBuffer, Loc, TUPath) or in the caller, passing in (syntax::Token*, SourceMgr, TUPath).

As discussed offline, the problem is that TokenBuffer won't hold spelled tokens for included files, so if you've got a macro definition or declaration coming from a header, spelledTokenAt will fail.
Making use of `Lexer::MeasureTokenLength` instead.
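
For reference, a rough sketch of that approach (the helper name, signature, and includes are illustrative, not taken from the patch):

  #include "clang/Basic/LangOptions.h"
  #include "clang/Basic/SourceLocation.h"
  #include "clang/Basic/SourceManager.h"
  #include "clang/Lex/Lexer.h"

  // Measure the token at Loc with the raw lexer, so this also works for
  // locations in included files, where TokenBuffer has no spelled tokens.
  static clang::CharSourceRange rawTokenRange(clang::SourceLocation Loc,
                                              const clang::SourceManager &SM,
                                              const clang::LangOptions &LangOpts) {
    // Relexes just the single token starting at Loc.
    unsigned Len = clang::Lexer::MeasureTokenLength(Loc, SM, LangOpts);
    // Half-open character range covering the token: [Loc, Loc + Len).
    return clang::CharSourceRange::getCharRange(Loc, Loc.getLocWithOffset(Len));
  }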


================
Comment at: clang-tools-extra/clangd/XRefs.cpp:155
   const SourceManager &SourceMgr = AST.getSourceManager();
-  const FileEntry *F = SourceMgr.getFileEntryForID(SourceMgr.getFileID(TokLoc));
+  auto Tok = syntax::tokenize(syntax::FileRange(SourceMgr, TokLoc, 1),
+                              SourceMgr, AST.getLangOpts());
----------------
kadircet wrote:
> sammccall wrote:
> > sammccall wrote:
> > > sammccall wrote:
> > > > so this is just running the raw lexer...
> > > > 
> > > > both callers have a ParsedAST, can we call spelledTokenAt()? Either here, passing in (TokenBuffer, Loc, TUPath) or in the caller, passing in (syntax::Token*, SourceMgr, TUPath).
> > > It *seems* to always be the case that TokLoc is a spelled token, but this is slightly subtle and the previous code defended against macro locations. Worth a comment.
> > > can we call spelledTokenAt
> > 
> > as you pointed out offline, we only have spelled tokens for the main file.
> > I think we have to lex here to get the token length, and the clearest thing is to do this directly (`Lexer::MeasureTokenLength` -> `Loc.getLocWithOffset`) rather than obscuring it behind TokenBuffer.
> > We should have a comment *why* we lex here.
> > so this is just running the raw lexer...
> > both callers have a ParsedAST, can we call spelledTokenAt()? Either here, passing in (TokenBuffer, Loc, TUPath) or in the caller, passing in (syntax::Token*, SourceMgr, TUPath).
> 
> As discussed offline, the problem is that TokenBuffer won't hold spelled tokens for included files, so if you've got a macro definition or declaration coming from a header, spelledTokenAt will fail.
> Making use of `Lexer::MeasureTokenLength` instead.
As discussed offline, in the case of macro locations we'll fail to get the file path and stop there, and that logic is unchanged.
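
For context, a sketch of the early exit being referred to (paraphrased for illustration, not quoted from the patch):

  #include "clang/Basic/FileManager.h"
  #include "clang/Basic/SourceLocation.h"
  #include "clang/Basic/SourceManager.h"
  #include "llvm/ADT/Optional.h"
  #include <string>

  // Macro locations have no backing FileEntry: getFileID() resolves to a
  // macro-expansion entry, so getFileEntryForID() returns nullptr and we
  // bail out before computing a path, just like the old code did.
  static llvm::Optional<std::string>
  filePathForLoc(clang::SourceLocation Loc, const clang::SourceManager &SM) {
    const clang::FileEntry *FE = SM.getFileEntryForID(SM.getFileID(Loc));
    if (!FE)
      return llvm::None;
    return FE->getName().str();
  }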


Repository:
  rG LLVM Github Monorepo

CHANGES SINCE LAST ACTION
  https://reviews.llvm.org/D75474/new/

https://reviews.llvm.org/D75474