[Lldb-commits] [PATCH] D48704: [ExecutionContext] Return the target/process byte order.

Pavel Labath via Phabricator via lldb-commits lldb-commits at lists.llvm.org
Tue Aug 6 23:49:41 PDT 2019


labath accepted this revision.
labath added a comment.

Thanks for setting this up. The infrastructure is a bit of an overkill for a simple test like this, but hopefully it will be useful for other tests too. I have a bunch of comments, but hopefully none of them are major, so if you agree with all of them, just fire away.



================
Comment at: lldb/unittests/Target/ExecutionContextTest.cpp:40-43
+    HostInfo::Terminate();
+    FileSystem::Terminate();
+    Reproducer::Terminate();
+    platform_linux::PlatformLinux::Terminate();
----------------
Call the Terminate functions in the reverse order of initialization?
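Something along these lines is what I have in mind (just a sketch; I'm assuming the fixture's SetUp initializes in the order Reproducer, FileSystem, HostInfo, PlatformLinux, which may not match the actual test):

```
  void TearDown() override {
    // Tear down in the reverse of the (assumed) initialization order:
    // Reproducer -> FileSystem -> HostInfo -> PlatformLinux.
    platform_linux::PlatformLinux::Terminate();
    HostInfo::Terminate();
    FileSystem::Terminate();
    Reproducer::Terminate();
  }
```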


================
Comment at: lldb/unittests/Target/ExecutionContextTest.cpp:77
+TEST_F(ExecutionContextTest, GetByteOrderTarget) {
+  ArchSpec arch = Target::GetDefaultArchitecture();
+  arch.SetTriple("armv7-pc-apple");
----------------
I don't understand this. Why do you fetch the ArchSpec via `Target::GetDefaultArchitecture()` and then immediately overwrite it on the next line? Can this line just be deleted?
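For illustration, what I would expect instead is just something like this (assuming nothing else in the test relies on the default architecture):

```
  // Construct the ArchSpec directly from the triple; the default
  // architecture fetched above would be overwritten anyway.
  ArchSpec arch("armv7-pc-apple");
```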


================
Comment at: lldb/unittests/Target/ExecutionContextTest.cpp:78
+  ArchSpec arch = Target::GetDefaultArchitecture();
+  arch.SetTriple("armv7-pc-apple");
+
----------------
It looks odd to have an Apple triple with a Linux platform. I assume you're using the Linux platform because it can be compiled everywhere, but in that case, maybe you could use a Linux triple too? Maybe also use a big-endian triple in one of the tests to catch regressions regardless of the host's endianness?
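For example, something like this (just a sketch; any big-endian triple that ArchSpec recognizes would do):

```
  // A big-endian Linux triple makes the expected byte order differ from a
  // (typically little-endian) host, so an accidental fallback to the host
  // byte order would be caught by the test.
  ArchSpec big_endian_arch("powerpc64-pc-linux");
  EXPECT_EQ(lldb::eByteOrderBig, big_endian_arch.GetByteOrder());
```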


================
Comment at: lldb/unittests/Target/ExecutionContextTest.cpp:85
+  DebuggerSP debugger_sp = Debugger::CreateInstance();
+  EXPECT_TRUE(debugger_sp);
+
----------------
Change EXPECT_TRUE to ASSERT_TRUE. There's no point in continuing the test if this is false, as it will just crash anyway.
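I.e., something like:

```
  DebuggerSP debugger_sp = Debugger::CreateInstance();
  // ASSERT_TRUE aborts this test on failure instead of carrying on and
  // dereferencing a null DebuggerSP a few lines later.
  ASSERT_TRUE(debugger_sp);
```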


================
Comment at: lldb/unittests/Target/ExecutionContextTest.cpp:124
+  ExecutionContext process_ctx(process_sp);
+  EXPECT_EQ(process_sp->GetByteOrder(), process_ctx.GetByteOrder());
+}
----------------
It doesn't look like the presence of a process changes what ExecutionContext::GetByteOrder returns, as it checks the target first. OTOH, I don't know why the byte order of a process would ever be different from the byte order of its target...
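To spell out what I mean, the lookup order is roughly this (a sketch of the logic as I read it, not necessarily the verbatim patch):

```
  lldb::ByteOrder ExecutionContext::GetByteOrder() const {
    // The target is consulted first, so a process whose byte order somehow
    // differed from its target would not be observable through this accessor.
    if (m_target_sp && m_target_sp->GetArchitecture().IsValid())
      return m_target_sp->GetArchitecture().GetByteOrder();
    if (m_process_sp)
      return m_process_sp->GetByteOrder();
    return endian::InlHostByteOrder();
  }
```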


Repository:
  rLLDB LLDB

CHANGES SINCE LAST ACTION
  https://reviews.llvm.org/D48704/new/

https://reviews.llvm.org/D48704




