We built a media content website with PHP and want to rebuild it on React + Redux + react-router-redux, backed by an API. SEO is crucial for us, and we especially want to cover Google, Bing, Baidu, Yahoo, and Yandex; as far as we know, the last four do not execute JavaScript when crawling. Our current plan is to check the client's user agent and render React on the server if it is a crawler, and on the client if it is not. Does this count as cloaking? In practice, the content should be identical either way.
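To make the idea concrete, here is a minimal sketch of the user-agent branching we have in mind. The bot token list and the `isCrawler` helper are illustrative assumptions, not an exhaustive or production-ready detection scheme:

```javascript
// Illustrative crawler detection by user-agent substring.
// Tokens cover Googlebot, Bingbot, Baiduspider, Yahoo's Slurp, and YandexBot;
// this list is a sketch, not exhaustive.
const CRAWLER_RE = /googlebot|bingbot|baiduspider|slurp|yandex/i;

function isCrawler(userAgent) {
  return CRAWLER_RE.test(userAgent || '');
}

// In an Express-style handler we would branch on the result:
//   if (isCrawler(req.headers['user-agent'])) { /* send server-rendered HTML */ }
//   else { /* send the client-side React bundle */ }

console.log(isCrawler('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'));
console.log(isCrawler('Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 Chrome/58.0'));
```

Since both branches would serve the same content, the only difference is where the rendering happens.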
What are the best practices for this? We would also like to know more about how Google handles JavaScript.