These kinds of APIs need to be called via a user gesture and cannot be triggered programmatically in your code.
For example, calling requestFullscreen in your component’s init method will not work because it was not initiated by a user gesture. Calling requestFullscreen in the click event callback on a button will work because the button was clicked by the user.
I would check your code to ensure that toFullScreen is only being called as a result of a user gesture.
Looking at the code here, toFullScreen is not being called as a result of a user gesture, so I would not expect it to work consistently.
Some webviews/browsers implement this differently, but most give you a “grace period” of up to about 1 second for the user gesture to propagate through your code. For example, the webview on iOS 15 gives you a 1-second window to propagate user gestures through requestAnimationFrame: New WebKit Features in Safari 15 | WebKit
Removing as much async code as possible and ensuring this fullscreen call is a direct result of a user gesture is going to be important here to get this working consistently.
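To illustrate the difference, here is a minimal sketch of the two patterns. The element ids and function names are hypothetical; the point is that requestFullscreen must run synchronously inside the gesture handler, not after an async delay:

```javascript
// Works: requestFullscreen is called synchronously inside the click
// handler, so the browser still sees an active user gesture.
function wireFullscreenButton(button, video) {
  button.addEventListener('click', () => {
    video.requestFullscreen().catch((err) =>
      console.warn('Fullscreen request denied:', err)
    );
  });
}

// Likely to fail: by the time the timeout fires, the user-gesture
// window has usually expired, so the browser rejects the request.
function wireDelayedFullscreen(button, video) {
  button.addEventListener('click', () => {
    setTimeout(() => {
      video.requestFullscreen().catch((err) =>
        console.warn('Fullscreen request denied:', err)
      );
    }, 2000);
  });
}

// In a browser page you would wire it up with something like:
// wireFullscreenButton(document.getElementById('fs-btn'),
//                      document.getElementById('player'));
```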
Thank you so much. Now I understand a bit more. It’s as you said: when I click a button and flip the phone, it works. It seems the requestFullscreen API treats that as a gesture. When I click and then wait 2 seconds, it doesn’t work. I’ll figure out how to make it work efficiently. Maybe I need to use a plugin for the video.