author | Johann <johannkoenig@google.com> | 2011-04-07 13:17:22 -0400
---|---|---
committer | Johann <johannkoenig@google.com> | 2011-04-18 16:30:38 -0400
commit | c7cfde42a9ec05b72d15ebaa9a59cefed4cd323a (patch) |
tree | 395d38ba42df5e8be5abe33baa028bc937226155 | /vp8/encoder/x86/sad_mmx.asm
parent | d889035fe6802b64567c2ed250c1dff0eb377acf (diff) |
Add save/restore xmm registers in x86 assembly code
Went through the code and fixed it; verified on Windows.
Where possible, removed dependencies on xmm6/xmm7.
The current code relies on pushing rbp to the stack to get 16-byte
alignment. This broke when rbp wasn't pushed
(vp8/encoder/x86/sad_sse3.asm). Work around this by using unaligned
memory accesses. Revisit this, and the offsets in
vp8/encoder/x86/sad_sse3.asm, in another change to SAVE_XMM.
Change-Id: I5f940994d3ebfd977c3d68446cef20fd78b07877
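For context, the reason xmm registers must be saved at all: the Windows x64 calling convention treats xmm6-xmm15 as callee-saved (non-volatile), so any assembly routine that clobbers them must preserve their contents across the call. The sketch below is a hypothetical NASM macro pair (the names `SAVE_XMM_SKETCH`/`RESTORE_XMM_SKETCH` and the layout are illustrative, not the actual libvpx SAVE_XMM implementation); it also shows the unaligned-store workaround the message mentions, since `movdqu` works regardless of whether rsp is 16-byte aligned, whereas `movdqa` would fault on a misaligned address.

```nasm
; Hypothetical sketch of saving/restoring the Win64 callee-saved
; xmm6/xmm7 in a function prologue/epilogue. movdqu is used instead
; of movdqa so the spill slots need not be 16-byte aligned.
%macro SAVE_XMM_SKETCH 0
    sub     rsp, 32            ; reserve 2 x 16 bytes of spill space
    movdqu  [rsp],      xmm6   ; unaligned store: no alignment assumption
    movdqu  [rsp + 16], xmm7
%endmacro

%macro RESTORE_XMM_SKETCH 0
    movdqu  xmm6, [rsp]        ; restore in the same order
    movdqu  xmm7, [rsp + 16]
    add     rsp, 32            ; release spill space
%endmacro
```

On System V (Linux/macOS) all xmm registers are caller-saved, which is why code developed there can silently clobber xmm6/xmm7 and only break when built for Windows, as this change addresses.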
Diffstat (limited to 'vp8/encoder/x86/sad_mmx.asm')
0 files changed, 0 insertions, 0 deletions