The donor T61 is an 8889-ACG, and unfortunately it has the dedicated NVS 140M GPU. Compared to the Intel X3100 integrated video, it has some serious drawbacks:
- It is prone to the well-known Nvidia GPU failure caused by its bump material (see the sticky topic for details).
- It consumes a lot of power: at idle my T61 FV draws 14.3 W with everything off, backlight at the lowest level, and PowerMizer on max battery, while another T61 15.4" WXGA with the X3100 GPU draws only 9.5-10 W. Taking into account that my FlexView UXGA panel requires 1-1.5 W more than the 15.4" WXGA, I estimate that the Nvidia GPU consumes about 4-5 W at idle, while the Intel one consumes about 1-1.5 W.
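The 4-5 W estimate above is simple arithmetic; here is a sketch of it, taking the midpoints of the measured ranges (the 1.25 W figure for the X3100's own idle draw is my assumption, within the 1-1.5 W range stated above):

```python
# Back-of-envelope estimate of the NVS 140M's idle power draw,
# using the idle figures measured on the two T61s.
nvidia_t61_idle = 14.3   # W, T61 FV (UXGA) with NVS 140M
intel_t61_idle  = 9.75   # W, midpoint of 9.5-10 W for the X3100 T61
flexview_extra  = 1.25   # W, midpoint of the 1-1.5 W UXGA panel penalty

# Extra draw attributable to swapping the Intel GPU for the Nvidia one:
gpu_delta = nvidia_t61_idle - intel_t61_idle - flexview_extra
print(round(gpu_delta, 2))  # 3.3

# Adding back the X3100's own draw gives the Nvidia GPU's total:
intel_gpu_power = 1.25  # W, assumed idle draw of the X3100 itself
print(round(gpu_delta + intel_gpu_power, 2))  # 4.55
```

With the midpoints this lands at about 4.6 W, consistent with the 4-5 W range quoted above.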
So my biggest concern is lowering the power consumption of the Nvidia GPU. Using Nvidia Inspector and NVPMManagerUni, I set the GPU to a fixed max-battery level and underclocked it slightly, to 130 MHz. However, power consumption was still high because the GPU supply voltage was still 1.15 V. Therefore, my next objective was to lower the GPU voltage.
However, unlike the CPU voltage, the mobile GPU voltage cannot be altered by software. Some desktop video cards can have their voltage changed by altering the so-called voltage table in the video BIOS using the NiBiTor software, but this software does not recognize mobile video cards, where the video BIOS is integrated into the main BIOS. Therefore, the only way to go is a hardware mod of the GPU power supply circuit.
I found that Lenovo uses the ADP3209 IC to control the GPU voltage. This IC has 5 VID pins (like CPU power supply systems). Before going into action, I tried overclocking my GPU, and it could go as high as 650 MHz at 1.15 V. So my estimate is that it can hold the default frequency of 400 MHz at about 0.90 V. Based on that, I decided to set VID3 to logic level 1 instead of 0.
To do this mod, I removed the 0 Ω resistor R590 from the board to pull pin 26 (VID3) of the ADP3209 high (1). However, for a reason unknown at the moment, the level at this pin stayed low (0). To work around that, and because I had earlier accidentally removed R537, I connected pin 26 to pin 27 (VID2), so both are now high. After this mod the GPU voltage is 0.95 V at the lowest PowerMizer performance level (before: 1.15 V) and 0.90 V at the highest level (before: 1.20 V). You may find it strange that the latter voltage is lower than the former.
The laptop boots without problems; at 0.90 V I can even overclock the GPU a bit, up to 450 MHz. Using the Video Stability test, I found that the GPU temperature dropped by 8-10 °C (from 74 °C to 66 °C according to GPU-Z, or from 65 °C to 58 °C according to TPFanControl). Another effect is that at idle the laptop consumes only 13.0 W instead of the 14.3 W before the mod. Thus, battery life on my 9-cell 77 Wh battery is extended by about 10%, to roughly 4.5 hours compared to 4 hours before. I hope that with the lower working temperature, my Nvidia GPU will last longer.
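The roughly 10% battery-life gain follows directly from the idle-power numbers, assuming runtime scales inversely with draw (a simplification, since real-use draw is higher than idle):

```python
# Battery-life arithmetic from the measured idle draw before/after the mod.
idle_before, idle_after = 14.3, 13.0  # W, measured at idle

extension = idle_before / idle_after - 1
print(f"{extension:.0%}")  # 10%

runtime_before = 4.0  # h, observed before the mod
print(round(runtime_before * (1 + extension), 1))  # 4.4
```

That projects about 4.4 hours, close to the 4.5 hours observed.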
Here are images of the ADP3209's VID voltage table and this image (I don't know whether it violates the forum rules or not; if yes, I will remove it). Actual pictures of the mod will be posted later.