Voodooprojects
Chameleon => Bug Reports => Topic started by: BladeRunner on September 07, 2009, 08:01:06 PM
-
When I use device property strings my 512 MB GeForce 8400 GS graphics card works just fine and is seen as:
nVidia GeForce 8400 GS:
Chipset Model: nVidia GeForce 8400 GS
Type: GPU
Bus: PCIe
Slot: Slot-1
PCIe Lane Width: x1
VRAM (Total): 512 MB
Vendor: NVIDIA (0x10de)
Device ID: 0x06e4
Revision ID: 0x00a1
ROM Revision: nVidia GeForce 8400 GS OpenGL Engine [EFI]
Displays:
IBM G96:
Resolution: 1600 x 1200 @ 75 Hz
Pixel Depth: 32-Bit Color (ARGB8888)
Main Display: Yes
Mirror: Off
Online: Yes
Rotation: Supported
If I remove the device properties string and add GraphicsEnabler=Yes, my screen resolution is locked at 1024x768 and the System Profiler shows a generic display.
I used Wait=Yes to see the initial Chameleon messages. The messages displayed had the correct PCI addresses for the graphics and ethernet cards.
I ran some tests. I ran ioreg while booted using the device properties string. Then, I rebooted after removing the device properties string and adding the GraphicsEnabler option. I ran the same ioreg command again.
ioreg -f -l -p IODeviceTree > ioreg-enabled.txt
and then ran a diff command.
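For reference, the two-boot comparison described above could be scripted roughly like this (the file names are placeholders of my own; the reboot between dumps is of course manual):

```shell
# First boot: device-properties string present in com.apple.Boot.plist
ioreg -f -l -p IODeviceTree > ioreg-devprops.txt

# ...remove the string, set GraphicsEnabler=Yes, reboot, then:
ioreg -f -l -p IODeviceTree > ioreg-enabler.txt

# Compare the two registry dumps side by side
diff -y ioreg-devprops.txt ioreg-enabler.txt > ioreg.diff
```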
One thing is clear. The ioreg output with the device properties is vastly different from the output when I use the GraphicsEnabler option.
Can you suggest any other steps I can take to help resolve the issue? Would a different ioreg command be better?
I have included the two ioreg outputs along with the diff file. Hope it is useful.
Edit: Solved. The solution for this problem is posted in this thread:
http://forum.voodooprojects.org/index.php/topic,608.0.html
Many thanks to BuildSmart
-
dump the device-properties, use gfxutil to convert it from a .hex to a .plist , and then do a diff that makes more sense :P
-
dump the device-properties, use gfxutil to convert it from a .hex to a .plist , and then do a diff that makes more sense :P
I did as you requested.
I don't mean to be overly dense, but I don't understand running a diff between a plist file and an ioreg dump.
Result attached.
-
well.. you dump the device-properties in both cases and you compare.. :)
-
OK! It is obvious to me that we are using the same words but I am understanding them differently than you intend. I apologize for that.
I understand how to extract the hex string from my Boot.plist and use gfxutil to convert it to a plist format.
What is confusing me is that I only have one hex string that I am using. When I boot using it, everything works. When I remove it and replace it with the Chameleon option "GraphicsEnabler=Yes", the display is locked at 1024x768 and the System Profiler shows just a generic VGA display. In that case I don't have a hex string in the Boot.plist to dump and convert with gfxutil.
What I could do in each case is use "ioreg -n display@0 -p IODeviceTree", which produces output like this:
| | +-o pci-bridge@9 <class IOPCIDevice, id 0x0, registered, matched, active, busy 0 (3340 ms), retain 10>
| | +-o display@0 <class IOPCIDevice, id 0x0, registered, matched, active, busy 0 (3326 ms), retain 16>
| | | {
| | | "NVPM" = <01000000000000000000000000000000000000000000000000000000>
| | | "VRAM,totalsize" = <00000020>
| | | "NVCAP" = <04000000000003000c0000000000000700000000>
| | | "assigned-addresses" = <1000048200000000000000b20000000000000001140004c200000000000000e0000000000000001$
| | | "driver-ist" = <203e280401000000203e280402000000203e280403000000>
| | | "IOInterruptSpecifiers" = (<1500000007000000>,<0000000000000100>)
| | | "model" = <"nVidia GeForce 8400 GS">
| | | "AAPL,iokit-ndrv" = <6015ba34>
| | | "AAPL,gart-width" = 64
| | | "device_type" = <"NVDA,Parent">
| | | "IOInterruptControllers" = ("io-apic-0","IOPCIMessagedInterruptController")
which I could then compare with diff.
Is that closer to what you want?
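A minimal sketch of that narrower comparison (file names are placeholders; the second dump requires rebooting with GraphicsEnabler=Yes in between):

```shell
# Dump only the display@0 node and its properties in each boot configuration
ioreg -n display@0 -p IODeviceTree > display-devprops.txt
# ...reboot with GraphicsEnabler=Yes, then:
ioreg -n display@0 -p IODeviceTree > display-enabler.txt
diff display-devprops.txt display-enabler.txt
```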
-
You have your device-properties set under the /efi node, so you can get the hex of it by doing:
ioreg -lw0 -p IODeviceTree | grep device-properties
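Purely as an illustration of what that command implies (file names are placeholders, and the sed extraction and gfxutil location are my own assumptions), getting the hex blob into gfxutil might look like:

```shell
# Dump the IODeviceTree plane, keep only the device-properties line,
# and strip everything outside the <...> angle brackets to get raw hex
ioreg -lw0 -p IODeviceTree | grep device-properties \
  | sed 's/.*<//; s/>.*//' > devprop.hex

# Convert the hex blob to a readable plist with gfxutil
# (assumes gfxutil is in the current directory)
./gfxutil -i hex -o xml devprop.hex devprop.xml
```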
-
Using your command, the hex string I extract from the IORegistry is identical to the string in my Boot.plist and, through gfxutil, generates an xml file identical to the one I generated to build the string in the first place.
However, when booted using only the Chameleon GraphicsEnabler option, the string generated by your command causes this result in gfxutil:
root InstallPatch # ./gfxutil -n -s ioreg-DP-GFX.hex ioreg-DP-GFX.xml
gfxutil(282) malloc: *** mmap(size=4286517248) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
read_binary: invalid binary data
./gfxutil: cannot parse gfx data from hex input file 'ioreg-DP-GFX.hex'!
and no xml file is generated. The result is the same with or without the "-n -s" flags, and with or without the "-i hex -o xml" options.
I am attaching a side by side diff report of ioreg output taken from each boot session. The first few lines are comments added by me to show how the report and ioreg dumps were taken. Search for "display@" to see the section relating to the graphics card. This is the best compare I could get.
-
post the output of:
gfxutil -f display
-
post the output of:
gfxutil -f display
lrh InstallPatch $ ./gfxutil -f Display
DevicePath = PciRoot(0x0)/Pci(0x2,0x0)
I am reasonably sure this is the address of the integrated Intel graphics, which I don't use.
-
Try
?lspci
at the boot prompt, and see if there is anything wrong.
-
I tried the command. It printed a list of the PCI addresses. I saw the one produced by the gfxutil -f command and, lower down, the addresses of the ethernet card and the nVidia display. All the addresses looked normal to me and there were no error messages.