[LLVMbugs] [Bug 13062] New: Objective-C encoding of int[42] parameter doesn't match encoding of int[], int* parameters

bugzilla-daemon at llvm.org bugzilla-daemon at llvm.org
Fri Jun 8 15:46:26 PDT 2012


http://llvm.org/bugs/show_bug.cgi?id=13062

             Bug #: 13062
           Summary: Objective-C encoding of int[42] parameter doesn't
                    match encoding of int[], int* parameters
           Product: clang
           Version: trunk
          Platform: Macintosh
        OS/Version: MacOS X
            Status: NEW
          Severity: enhancement
          Priority: P
         Component: -New Bugs
        AssignedTo: unassignedclangbugs at nondot.org
        ReportedBy: arthur.j.odwyer at gmail.com
                CC: llvmbugs at cs.uiuc.edu
    Classification: Unclassified


These two Objective-C methods compile to the same assembly code, but have
different type-encodings (which you can see by dumping the Mach-O binary using
"otool -oV").

    -(void)foo: (int[42]) array_parameter;
    -(void)bar: (int[])   pointer_parameter;
    -(void)baz: (int*)    pointer_parameter;

In the first case, the type-encoding is `v24@0:8[42i]16`. In the second and
third cases, the type-encoding is `v24@0:8^i16`.

In Objective-C, as in C and C++, these methods *do* have identical type
signatures from the compiler's point of view: you can use (int*) in the
declaration and (int[42]) in the definition, or vice versa, with no diagnostic,
and sizeof(array_parameter) is the same as sizeof(pointer_parameter). In
other words, array parameters seem to decay to pointers, just as in C, *except*
in the one case of type-encodings.

The encoding stored in the Mach-O binary depends on the parameter type
specified in the implementation; the type specified in the interface
declaration is ignored as far as I can tell.

Is there some intentional significance to the difference between the
type-encodings of (int[42]) and (int*)? Or is there some unintentional
significance; e.g., can any Objective-C guru come up with a program that behaves
differently depending on the style of its parameter declarations? Or is this
just an odd implementation detail with no side effects visible to the user?

As a consumer of type-encoding information, I would love for all int-pointer
parameters (regardless of what syntax their declarations used) to be encoded as
`^i`.

-- 
Configure bugmail: http://llvm.org/bugs/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are on the CC list for the bug.


