Hi there,
First post, so if I'm in the wrong place, shout! I have a DT80/2 which I'm integrating into a system for a client. The system will monitor several thermistors and drive visual alarm outputs through a PCB relay board I have had fabricated.
The monitoring runs every 1 minute on job schedule A. Every 5 minutes I want to upload the recorded data to an external FTP server, which will handle the processing of data from this unit and others for bespoke web viewing. The fact that the DT80 has its own internal web server is nice, but irrelevant in this case.
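For reference, the sort of job I have in mind looks roughly like this. The channel numbers, thermistor types, alarm threshold and FTP details are placeholders, and I'm quoting the DO/COPYD syntax from memory of the DT80 Series 2 manual, so please treat it as a sketch rather than tested code:

BEGIN"TANKMON"
RA1M
  1YS04("Tank 1 degC",=1CV)              ' thermistor channels (placeholder types)
  2YS04("Tank 2 degC",=2CV)
  ALARM(1CV>30)"Tank 1 hot"{1DSO=1}      ' drive a relay board output on over-temperature
RB5M
  DO{COPYD dest=ftp://user:pass@server/path/ format=csv}   ' 5-minute FTP push
LOGON
END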
In this instance the unit is connected to a fast internet gateway, so the efficiency of the FTP push is not critical. However, I want to use the same design on remotely located equipment connecting over GPRS, possibly internationally, where the quantity of data transmitted has a significant impact on cost.
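To put some rough, purely illustrative numbers on that: if a CSV record for a handful of thermistors plus a timestamp is around 60 bytes, then logging once a minute generates about 86 KB of genuinely new data per day. But if every 5-minute push re-sends the whole store file, and the store has grown to, say, a week of data (roughly 600 KB), that is about 600 KB x 288 uploads = ~170 MB per day over GPRS for the same 86 KB of new information. That difference is what I am trying to eliminate.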
Currently the DT80/2 can unload the entire store file via FTP to a CSV on a web server, or it can handle pre-built dataTaker commands that limit or specify the data to unload.
I am also aware that the dataTaker language cannot process "home-built" time or date strings, so it's not possible to use the alarm functions to drive routines that calculate exactly how much data hasn't yet been uploaded and set the U command accordingly.
My first question: is it possible to add this to a wish list? My current thinking is that the most efficient way to restrict the FTP upload is to restrict the data store size.
That is easily done, but the cost is redundancy: if the file size is restricted, data may be overwritten before the next successful FTP upload is made, and so lost. There is the option to MOVEDATA on a successful transfer (29SV=2), but if the sampling interval is 1 minute and the FTP of the data store takes 2 (especially likely over GPRS), and a MOVEDATA is executed on success, then some new data will be copied to the archive in DBD format without ever having been sent to the FTP server as CSV.
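To make the race concrete, with purely illustrative timings: suppose the 12:00 FTP push starts and the store it is sending contains records up to 11:59. New records are logged at 12:01 and 12:02 while the transfer is still in progress. At 12:02 the transfer completes, 29SV goes to 2 and the MOVEDATA fires, archiving everything in the store to DBD, including the 12:01 and 12:02 records that were never in the CSV just sent. As I understand the behaviour, those records then live only in the archive, so the next 5-minute push won't pick them up either, and they never reach the server.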
Hopefully the dilemma is now apparent. My second question: has anyone developed a solution that gets around the restrictions on unloading data from the DT80/2 and reliably minimizes the amount of data pushed through FTP?
Thirdly, I have found the DT80 to be an awesome product, but its support software (DeLogger 5 & Pro) is frustrating, especially considering the price of the upgrade and what buying Pro actually gets you. There are LOTS of bugs that make the GUI unreliable, which is a real letdown considering the quality of the logger. This third point is merely for comment; the main issue is FTP unloading.
Any responses gratefully received.
pjmoseley