Information on Service 'GAVO Data Center TAP service'

The GAVO Data Center's TAP end point. The Table Access Protocol (TAP) lets you execute queries against our database tables, inspect various metadata, and upload your own data.
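As a minimal sketch of what that looks like from a script: the pyVO library is one client option among many (nothing in this page mandates it), and the endpoint URL in the sketch is an assumption; take the actual access URL from the Overview section below.

  import pyvo

  # Endpoint URL assumed here; use the access URL given under "Overview".
  service = pyvo.dal.TAPService("http://dc.g-vo.org/tap")

  # Run a small synchronous query against one of the tables exposed here.
  result = service.search("SELECT TOP 5 * FROM arihip.main")
  print(result.to_table())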

In GAVO's data center we hold, in particular, several large catalogs such as PPMXL, 2MASS PSC, USNO-B2, UCAC4, WISE, and SDSS DR7 for you to use in crossmatches, possibly with uploaded tables.
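A hedged sketch of such a crossmatch with an uploaded table, again using pyVO: the endpoint URL and the PPMXL column names raj2000/dej2000 are assumptions, so check the table metadata for the real names before running anything like this.

  import pyvo
  from astropy.table import Table

  service = pyvo.dal.TAPService("http://dc.g-vo.org/tap")  # endpoint URL assumed

  # A small local table of positions; it becomes TAP_UPLOAD.mine in the query.
  mine = Table({"ra": [10.68, 56.75], "dec": [41.27, 24.12]})

  query = """
  SELECT u.ra, u.dec, p.*
  FROM ppmxl.main AS p
  JOIN TAP_UPLOAD.mine AS u
    ON 1 = CONTAINS(POINT('ICRS', p.raj2000, p.dej2000),
                    CIRCLE('ICRS', u.ra, u.dec, 1.0/3600.0))
  """
  matches = service.search(query, uploads={"mine": mine})
  print(matches.to_table())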

Tables exposed through this endpoint include (listed as schema: tables):

  amanda: nucand
  annisred: main
  antares: data
  antares10: data
  apass: dr10
  apo: frames
  applause: main
  arigfh: gfh, id, identified, master, nid, unidentified
  arihip: main
  auger: main
  bgds: data, phot_all, ssa_time_series
  boydende: data
  browndwarfs: cat
  califadr3: cubes, fluxposv1200, fluxposv500, fluxv1200, fluxv500, objects, spectra
  cars: images, srccat
  carsarcs: meta
  danish: data
  dfbsplates: main
  dfbsspec: spectra, ssa
  dmubin: main
  emi: main
  feros: data
  fk6: fk6join, part1, part3
  flare_survey: data
  flashheros: data, ordersmeta
  fornax: data
  gaia: dr1, dr2_ts_ssa, dr2epochflux, dr2light
  gcpms: data
  gdr2dist: main
  gdr2mock: main, photometry
  gedr3mock: generated_data, maglim_5, maglim_6, maglim_7, main, parsec_props
  glots: columns, services, tables
  gps1: main
  hdgaia: main
  hiicounter: data
  hipparcos: main
  hppunion: main
  hsoy: main
  icecube: nucand
  inflight: data
  ivoa: obscore
  k2c9vst: events, photpoints, timeseries
  kapteyn: plates
  katkat: katkat
  lamost5: data
  lamost6: ssa_lrs, ssa_mrs
  lightmeter: geocounts, measurements, stations
  liverpool: rawframes
  lspm: main
  lsw: plates, wolfpalisa
  maclega: main
  maidanak: rawframes
  mcextinct: exts
  mlqso: cubes, slitspectra
  mpc: epn_core, mpcorb
  mwsc: main, stars
  mwsce14a: main, stars
  obscode: data
  ohmaser: bibrefs, maps, masers, monitor
  openngc: data, shapes
  pcc: main
  plc: data
  plc2: data
  plts: data
  polcatsmc: data
  potsdam: plates, rawplates
  ppakm31: cubes, maps
  ppmx: data
  ppmxl: main, usnocorr
  prdust: map10, map6, map7, map8, map9, map_union
  rave: dr2, dr3, dr4, main
  rosat: images, photons
  rr: alt_identifier, authorities, capability, interface, intf_param, registries, relationship, res_date, res_detail, res_role, res_schema, res_subject, res_table, resource, stc_spatial, stc_spectral, stc_temporal, subject_uat, table_column, validation
  sasmirala: objects, photpar
  sdssdr7: sources
  smakced: main
  spm4: main
  supercosmos: sources
  tap_schema: columns, groups, key_columns, keys, schemas, tables
  taptest: main
  tgas: main
  theossa: data
  titan: epn_core
  toss: data
  twomass: data
  ucac3: icrscorr, main, ppmxlcross
  ucac4: main
  ucac5: main
  urat1: main
  usnob: data, platecorrs, plates, ppmxcross, spurious, twomasscross
  veronqsos: data
  vlastripe82: stripe82
  wfpdb: archives, main
  wise: main
  zcosmos: data

For a list of all services and tables belonging to this service's resource, see Information on resource '__system__/tap'.

Service Documentation

This service speaks TAP, a protocol designed to allow the exchange of queries and data between clients (that's normally something running on your computer) and servers (e.g., us).

You will want to use some sort of client to query TAP services; examples include TOPCAT and the pyVO library used in the sketches throughout this page.

You can, in a pinch, use our service in an XML-enabled browser, too. Under Overview, look for the bullet point on tap and follow the link to "this service". Then click "New job..." in the job list, enter your query, click "Set query", then "Execute query". Reload the page you are redirected to every now and then to see whether your job has finished, then retrieve the result.
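The same create/run/poll/fetch cycle can be scripted. Here is a minimal sketch using pyVO's asynchronous interface (endpoint URL again assumed; take the real one from the Overview section):

  import pyvo

  service = pyvo.dal.TAPService("http://dc.g-vo.org/tap")  # endpoint URL assumed

  # Create the job, start it, wait until it reaches a final phase, then
  # pick up the result -- the same steps as in the browser workflow above.
  job = service.submit_job("SELECT TOP 1000 * FROM ppmxl.main")
  job.run()
  job.wait(phases=["COMPLETED", "ERROR", "ABORTED"])
  job.raise_if_error()
  result = job.fetch_result()
  job.delete()
  print(result.to_table())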

The queries this service executes are written in a dialect of SQL called ADQL, which you will need to learn to use this service. See our ADQL tutorial, and do not miss the local examples.
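To give a flavour of ADQL's astronomy-specific extensions, here is a hedged sketch of a cone search; the twomass.data column names raj2000, dej2000, and jmag are assumptions, so check the table metadata for the actual names.

  import pyvo

  service = pyvo.dal.TAPService("http://dc.g-vo.org/tap")  # endpoint URL assumed

  # ADQL adds geometric functions (POINT, CIRCLE, CONTAINS) on top of SQL;
  # this selects 2MASS sources within 0.5 degrees of (189.2, 62.21).
  query = """
  SELECT TOP 20 raj2000, dej2000, jmag
  FROM twomass.data
  WHERE 1 = CONTAINS(POINT('ICRS', raj2000, dej2000),
                     CIRCLE('ICRS', 189.2, 62.21, 0.5))
  """
  print(service.search(query).to_table())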

By the way, for quick ad-hoc queries from within a browser, our ADQL form service may be more convenient than TAP.

Also see the table metadata of the tables exposed here.
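That metadata can also be queried through the standard TAP_SCHEMA tables, which are themselves among the tables listed above. A short sketch with pyVO (endpoint URL assumed):

  import pyvo

  service = pyvo.dal.TAPService("http://dc.g-vo.org/tap")  # endpoint URL assumed

  # tap_schema.columns describes every column of every exposed table.
  query = """
  SELECT column_name, datatype, unit, description
  FROM tap_schema.columns
  WHERE table_name = 'ppmxl.main'
  """
  for row in service.search(query):
      print(row["column_name"], row["unit"], row["description"])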

Local information: This service may put a much lower limit on uploads in sync queries than advertised in the table metadata. Unfortunately, current TOPCAT versions don't give terribly helpful error messages when the server rejects a query because the attached upload is too large (something like “Error writing request body to server”). If that happens to you, just switch to asynchronous querying (in TOPCAT, that's in the “Mode” combo box just above the query editor).
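In a script, the switch amounts to running the upload through the asynchronous interface instead. A sketch with pyVO, where the file name, endpoint URL, and PPMXL column names are all assumptions:

  import pyvo
  from astropy.table import Table

  service = pyvo.dal.TAPService("http://dc.g-vo.org/tap")   # endpoint URL assumed
  big = Table.read("my_big_catalogue.vot", format="votable")  # hypothetical local file

  query = """
  SELECT p.*
  FROM ppmxl.main AS p
  JOIN TAP_UPLOAD.up AS u
    ON 1 = CONTAINS(POINT('ICRS', p.raj2000, p.dej2000),
                    CIRCLE('ICRS', u.ra, u.dec, 1.0/3600.0))
  """
  # run_async submits the job, waits for it, and fetches the result; large
  # uploads that a synchronous query would reject usually pass this way.
  result = service.run_async(query, uploads={"up": big})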

Issues

For information on our ADQL implementation, see the ADQL service info.

If multiple output columns in a query would end up having the same name, they are disambiguated in the output VOTable by appending underscores. This violates the specification, but fixing it would require rather intrusive changes to our software, and it is not clear why one would want to refer to output columns by names that are not unique to begin with, so this will probably not be fixed.

Overview

You can access this service using:

This service is published as follows: local, ivo_managed.

Here, local means it is listed on our front page, and ivo_managed means it has a record in the VO registry.

VOResource XML (that's something exclusively for VO nerds)