Here you can see a copy of the notice in Google Search Console.
If your site has been blocking Googlebot from accessing those files, it’s good to know about it so you can fix the problem. You can fix this issue by editing your site’s robots.txt file. Go ahead with this fix only if you’re comfortable editing that file.
Glance through the robots.txt file for any of the following lines of code:
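The notice itself doesn’t list the offending lines, but blocked CSS and JavaScript is usually caused by Disallow directives like the ones below. These are only common examples (the WordPress paths in particular); the exact paths in your robots.txt will vary by site:

```
# Common robots.txt directives that block Googlebot from CSS and JS
# (illustrative examples — your file may use different paths)
Disallow: /wp-includes/
Disallow: /wp-content/themes/
Disallow: /wp-content/plugins/
Disallow: /assets/
Disallow: /*.js$
Disallow: /*.css$
```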
Remove those lines if you see any of them. That’s what’s blocking Googlebot from crawling the files it needs to render your site as other users can see it.
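If you’d rather not remove broader Disallow rules that serve another purpose, an alternative sketch is to explicitly allow CSS and JavaScript files for Googlebot, along these lines (a commonly suggested pattern, not the only valid one):

```
# Explicitly allow Googlebot to fetch CSS and JS
# while leaving other Disallow rules in place
User-agent: Googlebot
Allow: .js
Allow: .css
```

Because more specific Allow rules take precedence for Googlebot, this lets it fetch the files it needs to render the page without opening up entire directories.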
The next step is to run your site through Google’s Fetch and Render tool to verify whether you’ve solved the issue. If Googlebot is still being blocked, the tool will provide further instructions on changes to make to the robots.txt file. In addition, you can identify any other crawling issues by using the robots.txt testing tool in Search Console.
Some webmasters are getting alerts for blocked third-party resources; however, Google has previously said third-party resources are not an issue, since they are generally outside of the webmaster’s control.