The simple way to do this is to make a bounding box in longitude and latitude coordinates and then just do intersection testing against a view bounding box. However, as you get farther from the equator the bounding boxes get progressively more distorted, and your queries end up either over- or under-inclusive for your datasets.
My solution is to use steradians, or solid angles: in effect, a spherical cap on the unit sphere. All you need is a normal vector and a coverage angle, four floating point numbers, which is the same as a bounding box. (I actually add a fifth to speed up the calculation, but I'll explain that later.) It performs uniformly anywhere on the sphere, and to specify global coverage you just set the angle to 180 degrees.
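As a sketch, assuming C, the representation could look like this (the struct and field names are mine, and the fifth float is the one I just mentioned):

    typedef struct {
        float nx, ny, nz;   /* unit normal pointing at the center of the coverage */
        float cos_angle;    /* cosine of the coverage angle */
        float sin_angle;    /* sine of the coverage angle; the extra float, explained below */
    } Steradian;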
Now all you need to do is calculate the steradian coverage for each dataset and one for the visible camera volume, and you can do intersection testing with just five multiplies, three adds and a compare. That is a little more work than a bounding box intersection test, but not much more, and I think it is an acceptable tradeoff for the properties it provides.
It's simple really: the dot product of two steradian normals gives you the cosine of the angle between them. Intersection is indicated by that angle being no greater than the sum of the two coverage angles, or:
dot(normal_a, normal_b) >= cos(angle_a + angle_b)
There is a nice formula for the cosine of a sum:
cos(a + b) = cos(a)cos(b) - sin(a)sin(b)
So if you store the sine and cosine of the angle rather than the angle itself, you get:
dot(normal_a, normal_b) >= cos_a*cos_b - sin_a*sin_b
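In code, sticking with the hypothetical Steradian struct from above, the whole test could look like this:

    int steradian_overlap(const Steradian *a, const Steradian *b)
    {
        /* Three multiplies and two adds for the dot product... */
        float d = a->nx * b->nx + a->ny * b->ny + a->nz * b->nz;
        /* ...two multiplies and a subtract for cos(angle_a + angle_b),
           then the compare. This assumes the two coverage angles sum to
           at most 180 degrees; past that the caps always overlap anyway. */
        return d >= a->cos_angle * b->cos_angle - a->sin_angle * b->sin_angle;
    }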
Simple, fast and effective. I haven't found a better-than-O(N^2) algorithm to determine the minimum steradian for a dataset yet. It is fast and easy to derive one from a lat/lon bounding box, but then you get the same over-coverage problems at high latitudes, so it is best to calculate the minimum steradian for each dataset. At least you only have to do it once on static data.
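For what it's worth, there is a simple O(N) way to get a valid, though generally not minimal, steradian for a point dataset: average the point vectors to pick the normal, then widen the angle until every point is covered. Something like this sketch of mine, assuming unit-length input points:

    #include <math.h>

    Steradian steradian_from_points(const float (*pts)[3], int n)
    {
        Steradian s;
        float cx = 0.0f, cy = 0.0f, cz = 0.0f;
        for (int i = 0; i < n; i++) {
            cx += pts[i][0]; cy += pts[i][1]; cz += pts[i][2];
        }
        /* Normalize the mean to get the cap normal. (If the points are
           spread so evenly that the mean is near zero, any normal with a
           180 degree angle will do.) */
        float len = sqrtf(cx * cx + cy * cy + cz * cz);
        s.nx = cx / len; s.ny = cy / len; s.nz = cz / len;
        /* The smallest dot product against the normal is the cosine of
           the widest angle we need to cover. */
        float min_dot = 1.0f;
        for (int i = 0; i < n; i++) {
            float d = s.nx * pts[i][0] + s.ny * pts[i][1] + s.nz * pts[i][2];
            if (d < min_dot) min_dot = d;
        }
        s.cos_angle = min_dot;
        s.sin_angle = sqrtf(1.0f - min_dot * min_dot);
        return s;
    }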