I mean, you can add their user agent to the robots file, but robots.txt is purely advisory: the crawler could just change its user agent, or ignore the file entirely if the server isn't filtering requests by user agent.
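For illustration, here's a minimal sketch of the kind of server-side user-agent filtering being described, using only Python's standard library; "BadBot" is a hypothetical crawler name, not a real one. It makes the comment's point concrete: this check is exactly as weak as a robots.txt entry like "User-agent: BadBot / Disallow: /", because a crawler that simply changes its User-Agent header sails straight past both.

    # Minimal sketch of filtering requests by User-Agent header.
    # "BadBot" is a hypothetical crawler name used for illustration.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BLOCKED_AGENTS = ("BadBot",)  # hypothetical blocklist

    class FilteringHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            ua = self.headers.get("User-Agent", "")
            # Reject any request whose User-Agent matches the blocklist.
            # A crawler spoofing its User-Agent bypasses this trivially.
            if any(bad in ua for bad in BLOCKED_AGENTS):
                self.send_error(403, "Forbidden")
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"hello\n")

    if __name__ == "__main__":
        HTTPServer(("", 8000), FilteringHandler).serve_forever()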
I blame the rise of frameworks, libraries, and IDEs. It’s easier for someone who knows nothing to throw some software together and ship it.
I very much disagree with this. Yes, to an extent you don't need to know as much as you might have in the past, but if we had to constantly reinvent the wheel, I don't think we would have nearly as many people entering or remaining in this field. Additionally, well-written frameworks and libraries can actually make your code safer, since you don't have to reinvent the wheel and rediscover all the pitfalls for yourself. IDEs are also a net positive, IMO: errors shown next to the line of code that caused them, breakpoints, interactive debugging. These are all things I personally would find hard to live without. Necessities? Technically, no. But good god do I not want to have to read build output unless necessary.
The API is going through code review right now!
Whoa whoa whoa, that's Momazon, a company that resembles but is legally distinct from Amazon.