A multimodal tactile dataset of affective touch gestures captured with a 3‑axis force sensor array. Sixteen participants performed 9 affective touch gestures (circular stroke, constant touch, heavy pat, back & forth rub, poke, isometric rub, tickle, twist, wide pinch) on a shear‑force sensor pad with 11×11 taxels sampled at 43 Hz, yielding 7,315 usable trials (~51 trials per gesture per participant, ~2 seconds each). Each trial is a 3‑channel spatiotemporal recording comprising normal (perpendicular) force, shear‑x, and shear‑y, supporting analysis of both the magnitude and direction of applied force across the contact patch. Data are distributed as per‑trial NumPy .npz files organized by gesture type, with arrays arr_0 (normal), arr_1 (shear‑x), and arr_2 (shear‑y), each of shape (frames, 11, 11).
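Given the file layout above, a single trial can be loaded as follows. This is a minimal sketch; the file path is hypothetical, but the array keys (arr_0, arr_1, arr_2) and shapes follow the dataset description:

```python
import numpy as np

def load_trial(path):
    """Load one trial's three force channels from a per-trial .npz file.

    Returns (normal, shear_x, shear_y), each of shape (frames, 11, 11),
    per the array naming stated in the dataset description.
    """
    with np.load(path) as data:
        normal = data["arr_0"]   # normal (perpendicular) force
        shear_x = data["arr_1"]  # shear force, x direction
        shear_y = data["arr_2"]  # shear force, y direction
    return normal, shear_x, shear_y

def shear_magnitude(shear_x, shear_y):
    """Per-taxel, per-frame shear magnitude from the two shear channels."""
    return np.sqrt(shear_x**2 + shear_y**2)
```

For example, `load_trial("poke/trial_001.npz")` (path hypothetical) would return three (frames, 11, 11) arrays, and `shear_magnitude` combines the two tangential channels into a single directionless shear map.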
Intended use: gesture recognition, affective computing, touch biometrics, and interpretable machine learning on high‑dimensional sensor data.
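For the gesture-recognition use case, trials of varying length (~2 s at 43 Hz) typically need a fixed-size input. A minimal sketch of one common approach, stacking the three channels and padding or truncating to a fixed frame count (the 86-frame target and function name are assumptions, not part of the dataset spec):

```python
import numpy as np

def to_model_input(normal, shear_x, shear_y, target_frames=86):
    """Stack the three (frames, 11, 11) channels into a fixed-size tensor.

    Returns shape (target_frames, 3, 11, 11): trials longer than the target
    are truncated, shorter ones are zero-padded at the end.
    """
    x = np.stack([normal, shear_x, shear_y], axis=1)  # (frames, 3, 11, 11)
    frames = x.shape[0]
    if frames >= target_frames:
        return x[:target_frames]
    pad = np.zeros((target_frames - frames, 3, 11, 11), dtype=x.dtype)
    return np.concatenate([x, pad], axis=0)
```

Zero-padding is one simple choice; resampling to a fixed length or using a sequence model that accepts variable-length input are equally valid alternatives.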