Information on Service 'Light pollution data upload facility'

You can use this service from your browser; further access options are discussed below.

We provide continuous night and day light measurements at all natural outdoor light levels, taken by a network of low-cost lightmeters. The network was developed in 2009 to start simple, global, continuous, high-cadence monitoring of night sky brightness and artificial night sky brightening (light pollution). The lightmeter network is a project of the Thüringer Landessternwarte, Tautenburg, Germany and the Kuffner-Sternwarte society at the Kuffner Observatory, Vienna, Austria. It started as part of the Dark Skies Awareness cornerstone of the International Year of Astronomy.

For a list of all services and tables belonging to this service's resource, see Information on resource 'Lightmeter Data'

Service Documentation

Getting a Station ID

To upload into GAVO's light pollution database, you need to obtain a station identifier; this uniquely describes the combination of sensor and location. If you change either, please get a new station id.

To get a station id, please fill out the following ASCII form and send it to gavo@ari.uni-heidelberg.de:

fullname:
stationId:
long:
lat:
height:
type:

Here,

fullname
is a nice, descriptive title for your station ("Lightmeter on the roof of stargazers' lair").
stationId
is all uppercase (country)_(city)_(count), e.g., DE_HEIDELBERG_2. We may need to change your suggestion.
long and lat
should be given in decimal degrees, with longitudes west of Greenwich having a minus sign.
height
should be in meters.
type
would currently be one of IYA Lightmeter or SQM-LU; if neither fits, contact us.

Here's an example for our Heidelberg station:

fullname: Heidelberg ARI Altbau, new device
stationId: DE_HEIDELBERG_2
long: 8.68813
lat: 49.417645
height: 115
accessKey: Leevae4i
type: IYA Lightmeter

Note that we currently do not support mobile stations. If you have data from devices that change their location frequently, please let us know.

Upload Formats

You can upload data in both CSV and "text" format. Text format consists of whitespace-separated lines of:

date time(utc) temperature count status

Such a file could look like this:

2009-05-29 00:00:01  11,3  235160 1
2009-05-29 00:00:02  11,3  235240 1
2009-05-29 00:00:03  11,3  235320 1
2009-05-29 00:00:04  11,3  235320 1
2009-05-29 00:00:05  11,3  235264 1
2009-05-29 00:00:06  11,3  234864 1
2009-05-29 00:00:07  11,3  235360 1
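
If you are writing your own readout or conversion code, a minimal Python sketch of producing such a line could look like the following; the helper name, temperature and count values are made up for illustration only:

import datetime

def format_reading(temperature, count, status=1):
    # One line of the "text" upload format: date, UTC time,
    # temperature, count, status.
    now = datetime.datetime.utcnow()
    return "%s %s  %s  %d %d" % (
        now.strftime("%Y-%m-%d"),
        now.strftime("%H:%M:%S"),
        ("%.1f" % temperature).replace(".", ","),
        count,
        status)

print(format_reading(11.3, 235160))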

The CSV format has semicolons as separators and must not have headers. It should look like this:

27.04.2009;12:31:14;36,9;°C;1942560;ok;
27.04.2009;12:31:15;37,0;°C;1947960;ok;
27.04.2009;12:31:16;37,0;°C;1951800;ok;
27.04.2009;12:31:17;36,9;°C;1943880;ok;
27.04.2009;12:31:18;37,0;°C;1947960;ok;
27.04.2009;12:31:19;36,9;°C;1956960;ok;

We accept floating point numbers with either commas or decimal points as decimal separators.

You can gzip your submissions before transfer.
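
For instance, a submission could be compressed in Python roughly like this; the file names are only examples, and we assume here that a .gz suffix is simply appended to the original name:

import gzip
import shutil

# Compress an existing submission file before transfer.
with open("2009-05-29.txt", "rb") as src:
    with gzip.open("2009-05-29.txt.gz", "wb") as dst:
        shutil.copyfileobj(src, dst)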

The receiving software is somewhat naive and infers the content from the file name extension. Legal extensions are:

Make sure you follow these conventions.

You can also upload ZIP archives of such files. The only legal extension here is zip (lower case!).
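
As a sketch, several daily files could be bundled into such an archive like this (the file names are examples):

import zipfile

with zipfile.ZipFile("submission.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for name in ["2009-05-29.txt", "2009-05-30.txt"]:
        zf.write(name)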

There is an upload limit of 20 MB on the data center software, i.e., you cannot upload a single file larger than 20 MB. If you try, your client will probably just say something to the effect of "connection reset". So, try to keep your uploads reasonably small.
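
If your accumulated data exceeds that limit, you could split it into smaller pieces before uploading. Here is a rough Python sketch; the file names are examples, and line lengths only approximate byte counts:

MAX_BYTES = 20 * 1024 * 1024

out, size, part = None, 0, 0
with open("station-log.txt") as src:
    for line in src:
        # Start a new part file before the current one would exceed the limit.
        if out is None or size + len(line) > MAX_BYTES:
            if out:
                out.close()
            part += 1
            out = open("station-log-%03d.txt" % part, "w")
            size = 0
        out.write(line)
        size += len(line)
if out:
    out.close()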

Automatic Uploads

Of course, manual uploads will quickly become tedious. Therefore, we provide an automatic upload facility. While you can use anything that can do HTTP uploads (the file goes into the inFile key, and you must give a __nevow_form__ key with the value upload), we provide a python script that already does everything, including automatic transfer compression. The program is written for python 2.x; unless you have a very good reason, you should not use python 3.x just yet, and if you do, please feed back the (minor) patches needed to make the uploader work with 3.x (while keeping it working with 2.x).

To use it, put a file named "stationinfo" into the directory from which you will upload. It must contain the station id, a blank, and the access key.
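
For the Heidelberg example above, the stationinfo file would contain a single line like:

DE_HEIDELBERG_2 Leevae4i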

Then, in a shell, say (adapt for the location of your python interpreter and the script as necessary):

python uploadLM.py FILE1 FILE2...

We expect uploading will be included in some of the readout software.

You can also just use curl; the call would look like this:

curl http://dc.g-vo.org/lightmeter/q/upload/custom/STATION_ID/ACCESS_KEY \
-F __nevow_form__=upload -F inFile=@PATH_TO_FILE

(Of course, you need to adapt everything in ALL_CAPS.) In the response, check the first paragraph of the div element with id body; it should contain something like "File FILENAME uploaded, XY bytes".
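
If you would rather script the upload in Python without uploadLM.py, a minimal sketch using the third-party requests library might look like this; as with curl, the station id, access key, and file name are placeholders you must adapt:

import requests

STATION_ID = "DE_HEIDELBERG_2"     # your station id
ACCESS_KEY = "YOUR_ACCESS_KEY"     # the access key you received
DATA_FILE = "2009-05-29.txt"       # file to upload

url = "http://dc.g-vo.org/lightmeter/q/upload/custom/%s/%s" % (
    STATION_ID, ACCESS_KEY)

with open(DATA_FILE, "rb") as f:
    # The file goes into the inFile key; __nevow_form__ must be "upload".
    resp = requests.post(
        url,
        data={"__nevow_form__": "upload"},
        files={"inFile": f})

resp.raise_for_status()
# Success is reported in the first paragraph of the div with id "body"
# of the returned HTML ("File FILENAME uploaded, XY bytes").
print(resp.text)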

Full Driver Software

We also provide full driver software that can run on a Raspberry Pi or similar small computers. It offers a web interface that lets you see your lightmeter's status (here is how this currently looks for the lightmeter in Heidelberg).

You will probably need some help to set this up -- please contact gavo@ari.uni-heidelberg.de as necessary. The necessary code is available from http://svn.ari.uni-heidelberg.de/svn/gavo/hdinputs/lightmeter/src; you want lightmeter.py and, unless you happen to have a local installation of python-libusb1, the usb1.py and libusb1.py modules.

Overview

This resource is not (directly) published. This can mean that it was deemed too unimportant, is for internal use only, or is just a helper for a published service. Equally likely, however, it is under development, abandoned during development, or otherwise unfinished. Exercise some caution.

This database of lightmeter measurements is made available under the Open Database License: http://opendatacommons.org/licenses/odbl/1.0/.

Any rights in individual contents of the database are licensed under the Database Contents License: http://opendatacommons.org/licenses/dbcl/1.0/
