Add option to use datetime objects for timestamp/datetime columns.
This is listed last in my order of preference because I imagine there could be many possible options like this, and it feels wrong to have to pass an option just to avoid an exception when these values are encountered.
This could be really interesting to try, and much better for performance with string and (out-of-bounds) timestamp columns. From https://uwekorn.com/2020/02/25/fletcher-status-report.html, it sounds like we'd probably be using FletcherChunkedArray, since I think we are converting each page to a pyarrow array and concatenating them at the end.
This PR fixes the problem of converting query results to pandas with `pyarrow` when the data contains timestamps that fall outside `pyarrow`'s nanosecond precision range.
The fix requires `pyarrow>=1.0.0`, thus it only works on Python 3.
### PR checklist
- [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/python-bigquery/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [x] Ensure the tests and linter pass
- [x] Code coverage does not decrease (if any source code was changed)
- [x] Appropriate docs were updated (if necessary)