Background/context + problem: Possibly not the most common use case, but these days I mostly use my old Windows 2-in-1 as a tablet, with Firefox as my main browser. When I'm doing something on the tablet, mainly watching things, I like to do it in fullscreen for the extra screen real estate (and to make it feel more like a tablet), and I've added the fullscreen button to the top bar to make it easy to go into fullscreen mode.
However, it's much less easy to come _out_ of fullscreen mode, e.g. to change tabs or just to check the battery level, when all I have is touch. I usually end up stabbing wildly with my finger at the boundary between the screen and the bezel, hoping to hit the right pixel, and then having to avoid hitting another tab in the process because the touch actually interacts with the top bar rather than just revealing it. It's clear this interaction was not designed with touch in mind, which is understandable as it's a less frequent use case, but it is frustrating.
What would make more sense: while in fullscreen mode, swiping down from the top bezel reveals the top bar (currently a swipe does nothing). No action performed while the top bar is hidden should interact with its contents, e.g. the tab shouldn't change if somebody swipes down on top of a tab. Maybe this interaction should also only be available when the computer is in tablet mode (available in Win 10, not sure about Win 11) or is otherwise being used via touch, so it doesn't conflict with mousing to the top of the screen? A rough sketch of the behaviour I have in mind is below.
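I'm not a Firefox developer, so this is purely an illustration of the gesture semantics I mean, not how Firefox's fullscreen chrome actually works. It's a minimal TypeScript sketch using standard DOM touch events; the edge-zone size, the swipe threshold, and `revealTopBar()` are made-up placeholders:

```typescript
// Conceptual sketch only: a top-edge swipe reveals the hidden top bar,
// and the touch is swallowed so it can't activate anything underneath.
// The constants and revealTopBar() are hypothetical, not Firefox internals.

const EDGE_ZONE_PX = 24;       // touches starting this close to the top edge count as "from the bezel"
const SWIPE_THRESHOLD_PX = 30; // how far the finger must travel downward to trigger the reveal

let startY: number | null = null;
let barRevealed = false;

function revealTopBar(): void {
  // Hypothetical: show the tab strip / navigation bar without clicking anything in it.
  barRevealed = true;
  console.log("top bar revealed");
}

window.addEventListener(
  "touchstart",
  (e: TouchEvent) => {
    const touch = e.touches[0];
    if (!barRevealed && touch.clientY <= EDGE_ZONE_PX) {
      startY = touch.clientY;
      // Swallow the touch so it can't land on a tab or button that is still hidden.
      e.preventDefault();
    }
  },
  { passive: false }
);

window.addEventListener(
  "touchmove",
  (e: TouchEvent) => {
    if (startY === null) return;
    const dy = e.touches[0].clientY - startY;
    if (dy >= SWIPE_THRESHOLD_PX) {
      revealTopBar();
      startY = null;
    }
    e.preventDefault(); // keep the gesture from scrolling the page content
  },
  { passive: false }
);

window.addEventListener("touchend", () => {
  startY = null;
});
```

The important part is that while the bar is hidden, a touch starting at the top edge only reveals the bar and never changes the tab or presses a button underneath it.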
This simple interaction change would make using Firefox on a 2-in-1 in tablet mode so much better, and would help extend the life of my old device and keep it out of landfill even as my main computing has moved to a newer machine 😊