a guest
Feb 17th, 2020
----Create a database----
$ influx
> CREATE DATABASE "NAME"
----Use a database----
> USE "NAME_OF_DATABASE"
----Drop series----
> DROP SERIES FROM "NAME"
----Drop measurement----
> DROP MEASUREMENT "NAME"
----See a measurement----
> SELECT * FROM "NAME"
----See with a timestamp filter----
> SELECT * FROM "h2o_feet" WHERE time > now() - 7d
----Delete measurement data older than a time----
> DELETE FROM "NAME" WHERE time < 'TIME'
----Insert a manual value with a timestamp----
> INSERT treasures,captain_id=pirate_king value=2 15556874561
         ^--measurement--^--tag----------^-value--^-timestamp-
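The anatomy above (measurement, comma-joined tags, fields, timestamp) also applies when writing many points at once from a file. A minimal sketch of building such a line-protocol file from the shell; the timestamps are epoch seconds, matching `precision=s` on the write endpoint, and the commented-out write assumes a hypothetical local server:

```shell
# Build a small line-protocol file: measurement,tag=value field=value timestamp
# Epoch-seconds timestamps match precision=s on the /write endpoint.
NOW=$(date +%s)
cat > /tmp/points.txt <<EOF
treasures,captain_id=pirate_king value=2 $NOW
treasures,captain_id=dread_pirate value=5 $NOW
EOF
cat /tmp/points.txt
# To actually write the points (hypothetical local server):
# curl -i -XPOST "http://localhost:8086/write?db=NAME&precision=s" --data-binary @/tmp/points.txt
```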
----Last values----
> SELECT * FROM WVdata WHERE time > now() - 1h ORDER BY time DESC LIMIT 10
----Field types----
> SHOW FIELD KEYS FROM "MEASUREMENT"

----Duplicate measurements---- // 1st form copies into a measurement with the same name, 2nd form writes into a new measurement name
> USE firstdatabase
> SELECT * INTO "database2"..:MEASUREMENT FROM "firstmeasurement" GROUP BY *
or
> SELECT * INTO "database2"..YOUR_NEW_MEASUREMENT FROM "firstmeasurement" GROUP BY *
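The same copy can be run non-interactively with `influx -execute`. A sketch reusing the placeholder names above ("copy_of_first" is a hypothetical target measurement name); the command is only printed, not executed, since no server is assumed here:

```shell
# Dry run: build the SELECT INTO query and print the full influx command.
Q='SELECT * INTO "database2"..copy_of_first FROM "firstmeasurement" GROUP BY *'
echo influx -database "firstdatabase" -execute "$Q"
```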
  29.  
----COUNT METRICS----
> SELECT COUNT(*) FROM "measurement"
----Check size on disk----
$ sudo du -sh /var/lib/influxdb/data/DATABASE
----Find the difference between min and max values per day, in a given timezone----
> SELECT SPREAD(value) FROM testing WHERE ID='L1E' AND time > now() - 1d GROUP BY time(1d) tz('Europe/Athens')
  36.  
  37.  
---GET-----
$ influx -database 'csv_WVdata' -host '160.40.49.235' -username 'influxadmin' -password 'admin' -execute 'select * from WVdata limit 1'
--------
  41.  
---Show databases---
$ influx -execute 'SHOW DATABASES'

---Execute queries that require a database specification, and change the timestamp precision---
$ influx -execute 'SELECT * FROM "SERIES" LIMIT 3' -database="DATABASE" -precision=rfc3339
  47.  
---Specify the format of the server responses with -format---
The default format is column:
$ influx -format=column
Change the format to csv:
$ influx -format=csv
Change the format to json:
$ influx -format=json -pretty
  51.  
----Export data to an Excel-readable CSV file----
$ sudo /usr/bin/influx -precision rfc3339 -database 'openhab_db' -host 'localhost' -username 'openhab' -password 'admin' -execute 'select * from InfluxLogging_Commands' -format 'csv' > /home/InfluxToExel/export.csv
  ^--RFC3339 timestamps--^--name of DB--^--DB host--^--DB credentials--^--measurement to export--^--output path and file name--
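A small variation on the export above: stamping the output file name with the current date, as the Gavazzi examples further down do with `$(date +%F)`. The influx command itself is left commented out since it needs a running server; `touch` stands in for the redirected output:

```shell
# Build a date-stamped export path, e.g. /tmp/export_2020-02-17.csv
OUT="/tmp/export_$(date +%F).csv"
touch "$OUT"                      # stands in for the redirected influx output
echo "would export to: $OUT"
# influx -precision rfc3339 -database 'openhab_db' -host 'localhost' -username 'openhab' \
#   -password 'admin' -execute 'select * from InfluxLogging_Commands' -format 'csv' > "$OUT"
```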
  55.  
----------------------with limit-----------------------------------------------------------------------
$ influx -database 'openhab_db' -host 'localhost' -username 'openhab' -password 'admin' -execute 'select * from L1P_Avrg limit 3' -format 'csv' > /home/InfluxToExel/export.csv

$ influx -database 'csv_WattVolt' -host 'localhost' -username 'influxadmin' -password 'admin' -execute 'select * from WVdata' -format 'column' > /etc/telegraf/export.txt   <----- testing
  60.  
  61.  
  62.  
Note: you can skip the sudo /usr/bin/ part and start the command with just influx.
  64.  
  65.  
----------------------post it with curl----------------------------------
$ curl -i -XPOST "http://influxdb_hostname:port/write?db=YOURDATABASENAME&precision=s" -u username:password --data-binary @YOURFILE.txt

-----example----------------------------------------------------------------------------------------------
$ curl -i -XPOST "http://160.40.48.86:8086/write?db=import&precision=s" --data-binary @import.txt
  71.  
  72.  
  73.  
------Export data to JSON with GET, chunked (1 result per line, limited to the last 1 day)---------------
$ curl -G 'http://localhost:8086/query' --data-urlencode "db=openhab_db" --data-urlencode "chunked=true" --data-urlencode "chunk_size=1" --data-urlencode "q=SELECT * FROM InfluxLogging_Commands where time > now() - 1d" > /home/InfluxToExel/exporttest2.json
  76.  
------Export multiple measurements to JSON with GET, chunked (1 result per line, limited to the last 1 day)---------------
$ sudo curl -G 'http://localhost:8086/query' --data-urlencode "db=openhab_db" --data-urlencode "chunked=true" --data-urlencode "chunk_size=1" --data-urlencode "q=SELECT * FROM InfluxLogging_Commands where time > now() - 1d; SELECT * FROM L1P_Avrg where time > now() - 1d" > /home/InfluxToExel/exporttest2.json
  79.  
  80.  
  81.  
------Chunking (split results across responses)------------
$ curl -G 'http://localhost:8086/query' --data-urlencode "db=deluge" --data-urlencode "chunked=true" --data-urlencode "chunk_size=20000" --data-urlencode "q=SELECT * FROM liters"

  ^--"chunked=true" enables chunking--  ^--"chunk_size" sets how many points per chunk (one chunk per line)--
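With chunked=true the /query endpoint streams one JSON object per chunk, newline-delimited, so the response can be processed line by line. A sketch where two fake chunk lines (hypothetical data) stand in for a real server response:

```shell
# Fake a chunked /query response: one JSON object per line.
cat > /tmp/chunks.json <<'EOF'
{"results":[{"series":[{"name":"liters","values":[[1,10]]}],"partial":true}]}
{"results":[{"series":[{"name":"liters","values":[[2,20]]}]}]}
EOF
# Process each chunk as it arrives (here: just count them).
COUNT=0
while IFS= read -r line; do
  COUNT=$((COUNT + 1))
done < /tmp/chunks.json
echo "chunks received: $COUNT"
```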
  86.  
--------EXAMPLE QUERY----------------------------------------------------
$ curl -G 'http://160.40.49.235:8086/query' -u influxadmin:admin --data-urlencode "db=BESSRES" --data-urlencode "chunked=true" --data-urlencode "chunk_size=1" --data-urlencode 'q=SELECT "mx:Energy_preds" FROM "House_01" WHERE time > now() - 10d LIMIT 10'

$ influx -precision=rfc3339 -database 'BESSRES' -host '160.40.49.235' -username 'influxadmin' -password 'admin' -execute 'select "mx:Energy" from House_01 WHERE time > now() - 10d ORDER BY time DESC LIMIT 10'
  91.  
--------------------MULTI QUERY----------------------------------
$ curl -G 'http://localhost:8086/query?pretty=true' -u influxadmin:admin --data-urlencode "db=openhab_db" --data-urlencode "q=SELECT * FROM L1A;SELECT * FROM L1A2;SELECT * FROM L1A3;SELECT * FROM L1P_Avrg;SELECT * FROM L1P2_Avrg;SELECT * FROM L1P3_Avrg;SELECT * FROM L1V;SELECT * FROM L1V2;SELECT * FROM L1V3;SELECT * FROM L1VA;SELECT * FROM L1VA2;SELECT * FROM L1VA3;SELECT * FROM L1E;SELECT * FROM L1E2;SELECT * FROM L1E3;SELECT * FROM L1E_Avrg;SELECT * FROM L1E2_Avrg;SELECT * FROM L1E3_Avrg" -H "Accept: application/csv" > /home/isaioglou/export.csv

$ curl -G 'http://localhost:8086/query?pretty=true' -u influxadmin:admin --data-urlencode "db=3ph_gavazzi" --data-urlencode "q=SELECT value FROM L1A,L1C,L1E,L1E_Avrg,L1F,L1P,L1P_Avrg,L1V,Weather_Humidity,Weather_Temperature,System_Temperature_CPU,System_Temperature_GPU WHERE time > now() - 20d LIMIT 1" -H "Accept: application/csv" > /home/isaioglou/export.csv

--------------------1ph_Gavazzi--------------------------------------------------------------------------------------------
$ curl -G 'http://localhost:8086/query?pretty=true' -u influxadmin:admin --data-urlencode "db=3ph_gavazzi" --data-urlencode "q=SELECT value FROM L1A,L1C,L1E,L1E_Avrg,L1F,L1P,L1P_Avrg,L1V,Weather_Humidity,Weather_Temperature,System_Temperature_CPU,System_Temperature_GPU WHERE time > now() - 2d" -H "Accept: application/csv" > /home/isaioglou/"1phGavazzi_$(date +%F)".csv
--------------------3ph_Gavazzi--------------------------------------------------------------------------------------------
$ curl -G 'http://localhost:8086/query' --data-urlencode "db=openhab_db" --data-urlencode "q=SELECT value FROM L1A,L1A2,L1A3,L1P,L1P2,L1P3,L1P_Avrg,L1P2_Avrg,L1P3_Avrg,L1E,L1E2,L1E3,L1E_Avrg,L1E2_Avrg,L1E3_Avrg,Weather_Humidity,Weather_Temperature,System_Temperature_CPU,System_Temperature_GPU WHERE time > now() - 2d" -H "Accept: application/csv" > /home/isaioglou/"1phGavazzi_$(date +%F)".csv
  101.  
  102.  
---------------CURL MULTI EXPORT TO CSV FROM DATABASE---------------------------------------------------------------------------------------

$ curl -G 'http://localhost:8086/query?pretty=true' -u influxadmin:admin --data-urlencode "db=trololo" --data-urlencode "q=SELECT * FROM napo;SELECT * FROM napo_v2" -H "Accept: application/csv" > /home/isaioglou/export.csv

{ Remove the -H "Accept: application/csv" header if you want JSON output }
You can use jq to convert the JSON output to CSV, which also lets you keep the RFC3339-formatted timestamps:

jq -r "(.results[0].series[0].columns), (.results[0].series[0].values[]) | @csv"
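A sketch of that jq filter in action on a sample file shaped like an InfluxDB /query response (the measurement name and values are hypothetical). The filter emits the columns array as a CSV header line, then one CSV row per values entry:

```shell
# sample.json mimics the shape of an InfluxDB /query JSON response.
cat > /tmp/sample.json <<'EOF'
{"results":[{"series":[{"name":"napo","columns":["time","value"],"values":[["2020-02-17T00:00:00Z",1],["2020-02-17T01:00:00Z",2]]}]}]}
EOF
# Prints a CSV header line followed by one row per value.
jq -r "(.results[0].series[0].columns), (.results[0].series[0].values[]) | @csv" /tmp/sample.json
```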