b193049375  locking/pv-qspinlock: Use cmpxchg_release() in __pv_queued_spin_unlock()  (Pan Xinhui, 9 years ago)
08be8f63c4  locking/pvstat: Separate wait_again and spurious wakeup stats  (Waiman Long, 9 years ago)
64a5e3cb30  locking/qspinlock: Improve readability  (Peter Zijlstra, 9 years ago)
229ce63157  locking/pvqspinlock: Fix double hash race  (Wanpeng Li, 9 years ago)
e37837fb62  locking/atomic: Remove the deprecated atomic_{set,clear}_mask() functions  (Peter Zijlstra, 9 years ago)
32d62510f9  locking/pvqspinlock: Enable slowpath locking count tracking  (Waiman Long, 9 years ago)
eaff0e7003  locking/pvqspinlock: Move lock stealing count tracking code into pv_queued_spin_steal_lock()  (Waiman Long, 9 years ago)
cd0272fab7  locking/pvqspinlock: Queue node adaptive spinning  (Waiman Long, 9 years ago)
1c4941fd53  locking/pvqspinlock: Allow limited lock stealing  (Waiman Long, 9 years ago)
45e898b735  locking/pvqspinlock: Collect slowpath lock statistics  (Waiman Long, 9 years ago)
d78045306c  locking/pvqspinlock, x86: Optimize the PV unlock code path  (Waiman Long, 9 years ago)
93edc8bd77  locking/pvqspinlock: Kick the PV CPU unconditionally when _Q_SLOW_VAL  (Waiman Long, 10 years ago)
75d2270280  locking/pvqspinlock: Only kick CPU at unlock time  (Waiman Long, 10 years ago)
3b3fdf10a8  locking/pvqspinlock: Order pv_unhash() after cmpxchg() on unlock slowpath  (Will Deacon, 10 years ago)
0b792bf519  locking: Clean up pvqspinlock warning  (Peter Zijlstra, 10 years ago)
cba77f03f2  locking/pvqspinlock: Fix kernel panic in locking-selftest  (Waiman Long, 10 years ago)
b92b8b35a2  locking/arch: Rename set_mb() to smp_store_mb()  (Peter Zijlstra, 10 years ago)
52c9d2badd  locking/pvqspinlock: Replace xchg() by the more descriptive set_mb()  (Waiman Long, 10 years ago)
a23db284fe  locking/pvqspinlock: Implement simple paravirt support for the qspinlock  (Waiman Long, 10 years ago)