Mastering Touch and Gesture Interactions in React
Touch and gesture interactions are nowadays fairly standard, and users have come to expect them. This article will show you how to easily add them to any React website.
Developers can easily integrate touch and gesture interactions into React apps using React-use-gesture, a flexible package that provides a wide range of tools for building clear and interesting user interfaces. It improves the interactivity of web apps through swipe movements and drag-and-drop capability, transforming the user experience. React-use-gesture simplifies complex gesture recognition, empowering developers to create intuitive, gesture-rich user interfaces that increase user engagement and bring more life and interactivity to web apps. In this tutorial, I'll demonstrate how and why you should use React-use-gesture.
Why use React-use-gesture?
React-use-gesture is a powerful library that significantly enhances the development of interactive and responsive user interfaces in React applications. Its abstraction of complex gesture handling makes it easy to integrate touch and mouse gestures like drag-and-drop and pinch-to-zoom. By using React-use-gesture, developers can combine a variety of hooks, including useDrag, usePinch, and useWheel, with React components, which promotes cleaner and more manageable code.
In keeping with React's principles, the library encourages declarative gesture-based interactions. Thanks to its flexibility and simplicity, developers can concentrate on creating dynamic and captivating user experiences instead of wrestling with low-level event handling.
Whether they are creating sophisticated interfaces, slides, or picture galleries, React-use-gesture lets developers easily include gestures in their applications, making them more responsive and intuitive. This library is a great addition to any React developer's toolset, especially for projects that require complex gesture controls without tedious event processing.
Setting up the project
To get started, we will create a new project using Vite. Enter the command below:
npm create vite@latest
Once the project is created and set up, we must install the React-use-gesture package. To do this, enter the command below in your command line:
npm install @use-gesture/react
Once installed, head into your IDE and import the package.
For this project, we will use the useGesture hook, which lets us handle multiple gestures with a single hook instead of calling numerous separate hooks. Directly below its import, we will reference some dummy images stored in our public folder and, as you can see below, arrange their paths in an array:
import { useState } from "react";
import { useGesture } from "@use-gesture/react";

const images = [
  "/images/Nft-1.jpg",
  "/images/Nft-2.png",
  "/images/Nft-3.jpg",
  "/images/Nft-4.jpg",
  "/images/Nft-5.jpg",
  "/images/Nft-6.jpg",
  "/images/Nft-7.png",
  "/images/Nft-8.jpg",
];
Adding some gestures to the project
React-use-gesture consists of hooks that you can add to your components to handle different gestures. You can explore all the available hooks and get creative.
For this project, we will use the useGesture hook, and within it, we will add gestures such as onDrag, onWheel, onHover, onScroll, onPinch, and onMove, as seen below:
const [position, setPosition] = useState([0, 0]);
const [hovered, setHovered] = useState(false);

const handlers = useGesture({
  onDrag: ({ offset: [x, y] }) => {
    setPosition([x, y]);
  },
  onWheel: ({ delta: [, dy] }) => {
    // Implement your logic for zooming or other actions with the wheel event
    setPosition((prevPosition) => [prevPosition[0], prevPosition[1] - dy]);
  },
  onHover: ({ hovering }) => {
    setHovered(hovering);
  },
  onScroll: ({ delta: [, dy] }) => {
    // Implementing logic for additional scroll functionality
    setPosition((prevPosition) => [prevPosition[0], prevPosition[1] - dy]);
  },
  onPinch: ({ offset: [s], memo = 1 }) => {
    // memo holds the value returned from the previous event; it is
    // undefined on the first event, so default it to 1 to avoid NaN
    const scale = memo * s;
    setPosition((prevPosition) => [
      prevPosition[0] * scale,
      prevPosition[1] * scale,
    ]);
    return s;
  },
  onMove: ({ offset: [mx, my] }) => {
    // Implementing logic for additional move functionality
    setPosition([mx, my]);
  },
});
As seen above, two states were created: the position state, which holds two values in an array, and the hovered state, which holds a boolean initialized to false.
Next, we configured the gesture handlers for the component using the useGesture hook. The handler object consists of six gestures, each with a specific purpose, discussed below.
Adding the onDrag gesture
const handlers = useGesture({
  onDrag: ({ offset: [x, y] }) => {
    setPosition([x, y]);
  },
});
The above code destructures the offset property from the event data received by the onDrag handler, representing the user's drag gesture. It extracts the x and y values indicating the horizontal and vertical offsets. The setPosition function then updates the element's position using these offsets, allowing it to move according to the user's drag action.
As seen in the image above, when a user drags a picture, the onDrag event fires. The position state is dynamically updated using the drag offset, allowing for seamless picture shifting. This motion improves user engagement with the image gallery by providing a natural drag-and-drop experience.
Adding the onWheel gesture
const handlers = useGesture({
  onWheel: ({ delta: [, dy] }) => {
    // Implement your logic for zooming or other actions with the wheel event
    setPosition((prevPosition) => [prevPosition[0], prevPosition[1] - dy]);
  },
});
The code above triggers the onWheel event when the user scrolls the mouse wheel. It destructures the delta property from the event data, representing the scroll distance. Since vertical scrolling is typically the intended action, it extracts the dy (vertical delta) value. Then, the setPosition function updates the position state by modifying the previous position's vertical component (y). This allows the element to move up or down in response to vertical scrolling.
As seen, when the mouse wheel scrolls, the onWheel event reacts, enabling features like zooming. A zoom-like effect is produced simply by shifting the position according to the vertical scroll direction (dy). This makes browsing the picture gallery a smooth and visually appealing experience.
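The wheel update itself is just arithmetic on the y coordinate: each event subtracts dy from the current vertical position. A small sketch of that logic as a pure function, with hypothetical dy values:

```javascript
// onWheel update as a pure function: scrolling down (positive dy)
// decreases y, which moves the element upward in CSS coordinates.
const applyWheel = ([x, y], dy) => [x, y - dy];

let pos = [0, 0];
for (const dy of [100, 100, -50]) {
  pos = applyWheel(pos, dy);
}
console.log(pos); // [0, -150]
```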
Adding the onHover gesture
const handlers = useGesture({
  onHover: ({ hovering }) => {
    setHovered(hovering);
  },
});
This code manages the onHover event, triggered when the user hovers over an element. It extracts the hovering property from the event data, indicating whether the element is being hovered over (true) or not (false). The setHovered function updates the hovered state based on the hovering value, letting the application track whether the user is hovering over an element.
As seen, the onHover event provides visual feedback to show whether an image is currently being hovered over and updates the hovered state accordingly. This small but useful gesture improves the user experience by offering responsive hints when interacting with specific gallery images.
Adding the onScroll gesture
const handlers = useGesture({
  onScroll: ({ delta: [, dy] }) => {
    // Implementing logic for additional scroll functionality
    setPosition((prevPosition) => [prevPosition[0], prevPosition[1] - dy]);
  },
});
This code handles the onScroll event when the user scrolls within a scrollable element. It extracts the delta property from the event data, representing the amount of scrolling. As vertical scrolling is the typical behavior, it only considers the dy (vertical delta) value. The setPosition function is then called to update the position state, adjusting the vertical position (y-coordinate) of the element based on the scrolling direction. If the user scrolls down (positive dy), the element moves upward, and vice versa. This keeps the element's position aligned with the scrolling action, providing a responsive user experience.
As seen above, the onScroll gesture lets users interact with the graphics by scrolling. The logic dynamically adjusts the image positions by capturing the direction of scrolling, giving users a smooth, natural scrolling experience through the gallery.
Adding the onPinch gesture
const handlers = useGesture({
  onPinch: ({ offset: [s], memo = 1 }) => {
    // memo holds the value returned from the previous event; it is
    // undefined on the first event, so default it to 1 to avoid NaN
    const scale = memo * s;
    setPosition((prevPosition) => [
      prevPosition[0] * scale,
      prevPosition[1] * scale,
    ]);
    return s;
  },
});
This code manages the onPinch event, which occurs when the user performs a pinch gesture, typically used for zooming or scaling. It extracts the offset property from the event data, which contains the pinch gesture's scale factor (s).
The function calculates the new scale by multiplying the current scale (memo, the value the handler returned on the previous event) by the scale factor obtained from the pinch gesture. It then updates the position state by scaling both the horizontal (x) and vertical (y) components of the previous position by the new scale factor.
Finally, it returns the scale factor so that use-gesture passes it back as memo on the next event. This allows the element to scale proportionally with the pinch gesture, enabling zooming functionality.
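The memo bookkeeping is easy to get wrong because use-gesture passes undefined as memo on the first event of a gesture. A pure-function sketch of the scale calculation, defaulting memo to 1 to guard that first event (the positions and scale factors here are hypothetical):

```javascript
// One onPinch step as a pure function: compute the combined scale from
// the previous memo and the current factor s, scale the position, and
// return s as the memo for the next event.
function pinchStep([px, py], s, memo = 1) {
  const scale = memo * s;
  return { position: [px * scale, py * scale], memo: s };
}

const step1 = pinchStep([10, 20], 2); // first event: memo defaults to 1
console.log(step1.position); // [20, 40]

const step2 = pinchStep(step1.position, 1.5, step1.memo);
console.log(step2.position); // [60, 120]
```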
The picture gallery now supports pinching thanks to the onPinch gesture. Users can zoom with pinch motions, just as they would on a touch device. By adjusting the visual scale, this gesture creates a dynamic zoom effect. It improves the visual inspection of pictures, particularly when a closer look is needed, and is most commonly used on mobile phones.
Adding the onMove gesture
const handlers = useGesture({
  onMove: ({ offset: [mx, my] }) => {
    // Implementing logic for additional move functionality
    setPosition([mx, my]);
  },
});
This code handles the onMove event, triggered when the user moves an element. It destructures the offset property from the event data, representing the movement distance along the horizontal (mx) and vertical (my) axes. The setPosition function then updates the position state using the new horizontal and vertical offsets obtained from the move gesture, allowing the element to be repositioned according to the user's movement.
The onMove gesture, as seen above, introduces additional movement behavior. The pictures react to user interaction by repositioning themselves in response to mouse movement and touch input. This makes the gallery interactive and dynamic, enabling visitors to interact in ways beyond just dragging. The functionality is ideal for websites with movable elements or containers, such as a gaming webpage with draggable graphics or objects.
Then, inside the return statement, a container div with a wrapper is created to hold the images:
<div className="container">
  <div className="wrapper">
    {images.map((image, index) => (
      <div key={index} className="image-container" {...handlers()}>
        <img
          style={{
            transform: `translate(${position[0]}px, ${position[1]}px)`,
            backgroundColor: hovered ? "lightblue" : "white", // change background color on hover
          }}
          src={image}
          alt={`Image ${index + 1}`}
        />
      </div>
    ))}
  </div>
</div>
The images.map call, seen above, generates an image-container for every image by iterating over the images array. The gesture handlers are applied to each image container by spreading {...handlers()}.
Each image container holds an img element. The pictures can be moved via the style property, which applies a transform based on the position state. Furthermore, depending on the hovered state, the background color becomes lightblue when an image is hovered over.
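That per-image style can be factored into a small helper for clarity; this imageStyle function is an illustrative extraction, not part of the original component:

```javascript
// Hypothetical helper that builds the inline style object the JSX
// above computes from the position and hovered state.
function imageStyle(position, hovered) {
  return {
    transform: `translate(${position[0]}px, ${position[1]}px)`,
    backgroundColor: hovered ? "lightblue" : "white",
  };
}

console.log(imageStyle([15, -30], true));
// { transform: "translate(15px, -30px)", backgroundColor: "lightblue" }
```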
Together, these gestures provide a more engaging user experience that accommodates a range of interaction preferences. The logic can be further modified by developers to meet the needs of a particular project, resulting in an attractive, incredibly versatile, and user-friendly picture gallery.
Conclusion
By learning how to use react-use-gesture to master touch and gesture interactions, we have gained a tool for building dynamic, user-friendly applications. This article equips developers to create responsive interfaces that enhance the user experience and expand the possibilities of interactive web applications.
You can find the complete source code for this project on GitHub: https://github.com/Chuksy25/Image-gallery.git
Gain control over your UX
See how users are using your site as if you were sitting next to them, learn, and iterate faster with OpenReplay — the open-source session replay tool for developers. Self-host it in minutes, and have complete control over your customer data. Check our GitHub repo and join the thousands of developers in our community.