--- /dev/null
+---
+name: "\U0001F41B Bug report"
+about: Something isn't working correctly with an add-on. This is the wrong place for user interface or openHAB Core issues.
+labels: bug
+
+---
+
+<!-- Provide a general summary of the issue in the *Title* above -->
+<!-- If the issue is related to a binding, please include its short name in -->
+<!-- square brackets in the title - Example: "[astro] My issue..." -->
+
+<!-- Important: Please contact the openHAB community forum for questions or -->
+<!-- for configuration and usage guidance: https://community.openhab.org -->
+
+<!-- Feel free to delete any comment lines in the template (starting with "<!--") -->
+
+## Expected Behavior
+<!-- If you're describing a bug, tell us what should happen -->
+<!-- If you're suggesting a change/improvement, tell us how it should work -->
+
+## Current Behavior
+<!-- If describing a bug, tell us what happens instead of the expected behavior -->
+<!-- Include related log information (preferably debug level) and related configs -->
+<!-- Use a file attachment for log and config information longer than a few lines -->
+<!-- Enclose multi-line log/code snippets with ``` on new lines for proper formatting -->
+<!-- If suggesting a change/improvement, explain the difference from current behavior -->
+<!-- For improvements, discuss at community.openhab.org first and include a link to the topic -->
+
+## Possible Solution
+<!-- Not obligatory, but suggest a fix/reason for the bug, -->
+<!-- or ideas how to implement the addition or change -->
+
+## Steps to Reproduce (for Bugs)
+<!-- Provide a link to a live example, or an unambiguous set of steps to -->
+<!-- reproduce this bug. Include code to reproduce, if relevant -->
+1.
+2.
+
+## Context
+<!-- How has this issue affected you? What are you trying to accomplish? -->
+<!-- Providing context helps us come up with a solution that is most useful in the real world -->
+
+## Your Environment
+<!-- Include as many relevant details as possible about the environment in which you experienced the bug -->
+* Version used: (e.g., openHAB and add-on versions)
+* Environment name and version (e.g. Chrome 76, Java 8, Node.js 12.9, ...):
+* Operating System and version (desktop or mobile, Windows 10, Raspbian Buster, ...):
--- /dev/null
+---
+name: "Documentation issue"
+about: Some information within the add-on documentation is wrong or missing
+labels: documentation
+
+---
+<!-- Please report only add-on related documentation issues here -->
+<!-- Documentation issues within user interfaces or the core should be -->
+<!-- reported at https://github.com/openhab/openhab-docs/issues/new -->
+
+<!-- Provide a general summary of the documentation issue in the *Title* above -->
+<!-- If the documentation issue is related to a specific add-on, please include its short name in -->
+<!-- square brackets in the title - Example: "[astro] My documentation issue..." -->
+
+<!-- Important: Please contact the openHAB community forum for questions or -->
+<!-- for configuration and usage guidance: https://community.openhab.org -->
+
--- /dev/null
+---
+name: "Feature request"
+about: You think that your favorite add-on should gain another feature
+labels: enhancement
+
+---
+
+<!-- Provide a general summary of the feature request in the *Title* above -->
+<!-- If the feature request is related to an add-on, please include its short name in -->
+<!-- square brackets in the title - Example: "[astro] My feature request..." -->
+
+<!-- Important: Please contact the openHAB community forum for questions or -->
+<!-- for configuration and usage guidance: https://community.openhab.org -->
+
+## Your Environment
+<!-- Include as many relevant details about the environment as applicable -->
+* Version used: (e.g., openHAB and add-on versions)
+* Environment name and version (e.g. Chrome 76, Java 8, Node.js 12.9, ...):
+* Operating System and version (desktop or mobile, Windows 10, Raspbian Buster, ...):
--- /dev/null
+---
+name: "\U0001F914 Support/Usage Question"
+about: For usage questions, please use the openHAB community board!
+labels: question
+
+---
+
+This is an issue tracker for reporting problems and requesting new features. For usage questions, please use the openHAB community board where there are a lot more people ready to help you out. Thanks!
+
+https://community.openhab.org/
--- /dev/null
+<!--
+Thanks for contributing to the openHAB project!
+Please describe the goal and effect of your PR here.
+Pay attention to the below notes and to *the guidelines* for this repository.
+Feel free to delete any comment sections in the template (starting with "<!--").
+-->
+
+<!-- TITLE -->
+
+<!--
+Please provide a PR summary in the *Title* above, according to the following schema:
+- If related to one specific add-on: Mention the add-on shortname in square brackets
+ e.g. "[exec]", "[netatmo]" or "[tesla]"
+- If the PR is work in progress: Add "[WIP]"
+- Give a short meaningful description in imperative mood
+ e.g. "Add support for device XYZ" or "Fix wrongly handled exception"
+ for a new add-on/binding: "Initial contribution"
+Examples:
+- "[homematic] Improve communication with weak signal devices"
+- "[timemachine][WIP] Initial contribution"
+- "Update contribution guidelines on new signing rules"
+-->
+
+<!-- DESCRIPTION -->
+
+<!--
+Please give a few sentences describing the overall goals of the pull request.
+Give enough details to make the improvement and changes of the PR understandable
+to both developers and tech-savvy users.
+
+Please keep the following in mind:
+- What is the classification of the PR, e.g. Bugfix, Improvement, Novel Addition, ... ?
+- Did you describe the PR's motivation and goal?
+- Did you provide a link to any prior discussion, e.g. an issue or community forum thread?
+- Did you describe new features for the end user?
+- Did you describe any noteworthy changes in usage for the end user?
+- Was the documentation updated accordingly, e.g. the add-on README?
+- Does your contribution follow the coding guidelines:
+ https://www.openhab.org/docs/developer/development/guidelines.html
+- Did you check for any (relevant) issues from the static code analysis:
+ https://www.openhab.org/docs/developer/development/bindings.html#static-code-analysis
+- Did you sign-off your work:
+ https://www.openhab.org/docs/developer/contributing/contributing.html#sign-your-work
+-->
+
+<!-- TESTING -->
+
+<!--
+Your Pull Request will automatically be built and made available under the following folder:
+https://openhab.jfrog.io/openhab/libs-pullrequest-local/org/openhab/
+
+It is a good practice to add a URL to your built JAR in this Pull Request description,
+so it is easier for the community to test your Add-on.
+If your Pull Request contains a new binding, it will likely take some time
+before it is reviewed and processed by maintainers.
+That said, consider submitting your Add-on to the Eclipse IoT Marketplace.
+See this thread for more info:
+https://community.openhab.org/t/24491
+
+Don't forget to submit a thread about your Add-on in the openHAB community:
+https://community.openhab.org/c/add-ons
+-->
--- /dev/null
+.antlr*
+.idea
+.DS_Store
+*.iml
+npm-debug.log
+.build.log
+
+.metadata/
+bin/
+target/
+src-gen/
+xtend-gen/
+.history/
+
+*/plugin.xml_gen
+**/.settings/org.eclipse.*
+
+bundles/**/src/main/history
+features/**/src/main/history
+features/**/src/main/feature
+
+.vscode
+.factorypath
--- /dev/null
+os: linux
+dist: focal
+
+language: java
+jdk: openjdk11
+
+cache:
+ directories:
+ - $HOME/.m2
+
+before_cache:
+ # remove resolver-status.properties files; they change with each run and invalidate the cache
+ - find $HOME/.m2 -name resolver-status.properties -exec rm {} \;
+
+notifications:
+ webhooks: https://www.travisbuddy.com/
+
+travisBuddy:
+ insertMode: update
+ successBuildLog: true
+
+install: true
+script: ./buildci.sh "$TRAVIS_COMMIT_RANGE"
--- /dev/null
+# This file helps GitHub automatically request reviews for new PRs.
+# It should always list the active maintainers of certain add-ons.
+
+# As a fallback, if no specific maintainer is listed below, assign the PR to the repo maintainers team:
+* @openhab/add-ons-maintainers
+
+# Add-on maintainers:
+/bundles/org.openhab.binding.airquality/ @kubawolanin
+/bundles/org.openhab.binding.airvisualnode/ @3cky
+/bundles/org.openhab.binding.allplay/ @dominicdesu
+/bundles/org.openhab.binding.amazondashbutton/ @OLibutzki
+/bundles/org.openhab.binding.amazonechocontrol/ @mgeramb
+/bundles/org.openhab.binding.ambientweather/ @mhilbush
+/bundles/org.openhab.binding.astro/ @gerrieg
+/bundles/org.openhab.binding.atlona/ @tmrobert8
+/bundles/org.openhab.binding.autelis/ @digitaldan
+/bundles/org.openhab.binding.avmfritz/ @cweitkamp
+/bundles/org.openhab.binding.bigassfan/ @mhilbush
+/bundles/org.openhab.binding.bluetooth/ @cdjackson @kaikreuzer
+/bundles/org.openhab.binding.bluetooth.bluegiga/ @cdjackson @kaikreuzer
+/bundles/org.openhab.binding.bluetooth.bluez/ @cdjackson @kaikreuzer
+/bundles/org.openhab.binding.bluetooth.blukii/ @kaikreuzer
+/bundles/org.openhab.binding.bluetooth.ruuvitag/ @ssalonen
+/bundles/org.openhab.binding.boschindego/ @jofleck
+/bundles/org.openhab.binding.bosesoundtouch/ @marvkis @tratho
+/bundles/org.openhab.binding.buienradar/ @gedejong
+/bundles/org.openhab.binding.chromecast/ @kaikreuzer
+/bundles/org.openhab.binding.cm11a/ @BobRak
+/bundles/org.openhab.binding.coolmasternet/ @projectgus
+/bundles/org.openhab.binding.daikin/ @caffineehacker @psmedley
+/bundles/org.openhab.binding.darksky/ @cweitkamp
+/bundles/org.openhab.binding.deconz/ @davidgraeff
+/bundles/org.openhab.binding.denonmarantz/ @jwveldhuis
+/bundles/org.openhab.binding.digiplex/ @rmichalak
+/bundles/org.openhab.binding.digitalstrom/ @MichaelOchel @msiegele
+/bundles/org.openhab.binding.dlinksmarthome/ @MikeJMajor
+/bundles/org.openhab.binding.dmx/ @J-N-K
+/bundles/org.openhab.binding.doorbird/ @mhilbush
+/bundles/org.openhab.binding.dscalarm/ @RSStephens
+/bundles/org.openhab.binding.dsmr/ @Hilbrand
+/bundles/org.openhab.binding.dwdunwetter/ @limdul79
+/bundles/org.openhab.binding.elerotransmitterstick/ @vbier
+/bundles/org.openhab.binding.enocean/ @fruggy83
+/bundles/org.openhab.binding.enturno/ @klocsson
+/bundles/org.openhab.binding.evohome/ @Nebula83
+/bundles/org.openhab.binding.exec/ @kgoderis
+/bundles/org.openhab.binding.feed/ @svilenvul
+/bundles/org.openhab.binding.feican/ @Hilbrand
+/bundles/org.openhab.binding.folding/ @fa2k
+/bundles/org.openhab.binding.foobot/ @airboxlab @Hilbrand
+/bundles/org.openhab.binding.freebox/ @lolodomo
+/bundles/org.openhab.binding.fronius/ @trokohl
+/bundles/org.openhab.binding.fsinternetradio/ @paphko
+/bundles/org.openhab.binding.ftpupload/ @paulianttila
+/bundles/org.openhab.binding.gardena/ @gerrieg
+/bundles/org.openhab.binding.globalcache/ @mhilbush
+/bundles/org.openhab.binding.gpstracker/ @gbicskei
+/bundles/org.openhab.binding.groheondus/ @FlorianSW
+/bundles/org.openhab.binding.harmonyhub/ @digitaldan
+/bundles/org.openhab.binding.hdanywhere/ @kgoderis
+/bundles/org.openhab.binding.hdpowerview/ @beowulfe
+/bundles/org.openhab.binding.helios/ @kgoderis
+/bundles/org.openhab.binding.heos/ @Wire82
+/bundles/org.openhab.binding.homematic/ @FStolte @gerrieg @mdicke2s
+/bundles/org.openhab.binding.hpprinter/ @cossey
+/bundles/org.openhab.binding.hue/ @cweitkamp
+/bundles/org.openhab.binding.hydrawise/ @digitaldan
+/bundles/org.openhab.binding.hyperion/ @tavalin
+/bundles/org.openhab.binding.iaqualink/ @digitaldan
+/bundles/org.openhab.binding.icloud/ @pgfeller
+/bundles/org.openhab.binding.ihc/ @paulianttila
+/bundles/org.openhab.binding.innogysmarthome/ @ollie-dev
+/bundles/org.openhab.binding.ipp/ @peuter
+/bundles/org.openhab.binding.irtrans/ @kgoderis
+/bundles/org.openhab.binding.jeelink/ @vbier
+/bundles/org.openhab.binding.keba/ @kgoderis
+/bundles/org.openhab.binding.km200/ @Markinus
+/bundles/org.openhab.binding.knx/ @sjka
+/bundles/org.openhab.binding.kodi/ @pail23 @cweitkamp
+/bundles/org.openhab.binding.konnected/ @volfan6415
+/bundles/org.openhab.binding.kostalinverter/ @cschneider
+/bundles/org.openhab.binding.lametrictime/ @syphr42
+/bundles/org.openhab.binding.leapmotion/ @kaikreuzer
+/bundles/org.openhab.binding.lghombot/ @FluBBaOfWard
+/bundles/org.openhab.binding.lgtvserial/ @fa2k
+/bundles/org.openhab.binding.lgwebos/ @sprehn
+/bundles/org.openhab.binding.lifx/ @wborn
+/bundles/org.openhab.binding.linuxinput/ @t-8ch
+/bundles/org.openhab.binding.lirc/ @kabili207
+/bundles/org.openhab.binding.logreader/ @paulianttila
+/bundles/org.openhab.binding.loxone/ @ppieczul
+/bundles/org.openhab.binding.lutron/ @actong @bobadair
+/bundles/org.openhab.binding.mail/ @J-N-K
+/bundles/org.openhab.binding.max/ @marcelrv
+/bundles/org.openhab.binding.mcp23017/ @aogorek
+/bundles/org.openhab.binding.melcloud/ @lucacalcaterra @paulianttila @thewiep
+/bundles/org.openhab.binding.meteoblue/ @9037568
+/bundles/org.openhab.binding.meteostick/ @cdjackson
+/bundles/org.openhab.binding.miele/ @kgoderis
+/bundles/org.openhab.binding.mihome/ @pboos
+/bundles/org.openhab.binding.miio/ @marcelrv
+/bundles/org.openhab.binding.millheat/ @seime
+/bundles/org.openhab.binding.milight/ @davidgraeff
+/bundles/org.openhab.binding.minecraft/ @ibaton
+/bundles/org.openhab.binding.modbus/ @ssalonen
+/bundles/org.openhab.binding.mqtt/ @davidgraeff
+/bundles/org.openhab.binding.mqtt.generic/ @davidgraeff
+/bundles/org.openhab.binding.mqtt.homeassistant/ @davidgraeff
+/bundles/org.openhab.binding.mqtt.homie/ @davidgraeff
+/bundles/org.openhab.binding.nanoleaf/ @raepple
+/bundles/org.openhab.binding.neato/ @jjlauterbach
+/bundles/org.openhab.binding.neeo/ @tmrobert8
+/bundles/org.openhab.binding.neohub/ @andrewfg
+/bundles/org.openhab.binding.nest/ @wborn
+/bundles/org.openhab.binding.netatmo/ @clinique @cweitkamp @lolodomo
+/bundles/org.openhab.binding.network/ @davidgraeff @mettke
+/bundles/org.openhab.binding.networkupstools/ @Hilbrand
+/bundles/org.openhab.binding.nibeheatpump/ @paulianttila
+/bundles/org.openhab.binding.nibeuplink/ @alexf2015
+/bundles/org.openhab.binding.nikobus/ @crnjan
+/bundles/org.openhab.binding.nikohomecontrol/ @mherwege
+/bundles/org.openhab.binding.ntp/ @marcelrv
+/bundles/org.openhab.binding.nuki/ @mkatter
+/bundles/org.openhab.binding.oceanic/ @kgoderis
+/bundles/org.openhab.binding.omnikinverter/ @hansbogert
+/bundles/org.openhab.binding.onebusaway/ @sdwilsh
+/bundles/org.openhab.binding.onewiregpio/ @aogorek
+/bundles/org.openhab.binding.onewire/ @J-N-K
+/bundles/org.openhab.binding.onkyo/ @pail23 @paulianttila
+/bundles/org.openhab.binding.opengarage/ @psmedley
+/bundles/org.openhab.binding.opensprinkler/ @CrackerStealth @FlorianSW
+/bundles/org.openhab.binding.openuv/ @clinique
+/bundles/org.openhab.binding.openweathermap/ @cweitkamp
+/bundles/org.openhab.binding.orvibo/ @tavalin
+/bundles/org.openhab.binding.paradoxalarm/ @theater
+/bundles/org.openhab.binding.pentair/ @jsjames
+/bundles/org.openhab.binding.phc/ @gnlpfjh
+/bundles/org.openhab.binding.pioneeravr/ @Stratehm
+/bundles/org.openhab.binding.pixometer/ @Confectrician
+/bundles/org.openhab.binding.pjlinkdevice/ @nils
+/bundles/org.openhab.binding.plclogo/ @falkena
+/bundles/org.openhab.binding.plugwise/ @wborn
+/bundles/org.openhab.binding.powermax/ @lolodomo
+/bundles/org.openhab.binding.pulseaudio/ @peuter
+/bundles/org.openhab.binding.pushbullet/ @hakan42
+/bundles/org.openhab.binding.regoheatpump/ @crnjan
+/bundles/org.openhab.binding.rfxcom/ @martinvw @paulianttila
+/bundles/org.openhab.binding.rme/ @kgoderis
+/bundles/org.openhab.binding.robonect/ @reyem
+/bundles/org.openhab.binding.rotel/ @lolodomo
+/bundles/org.openhab.binding.rotelra1x/ @fa2k
+/bundles/org.openhab.binding.russound/ @tmrobert8
+/bundles/org.openhab.binding.samsungtv/ @paulianttila
+/bundles/org.openhab.binding.satel/ @druciak
+/bundles/org.openhab.binding.seneye/ @nikotanghe
+/bundles/org.openhab.binding.sensebox/ @hakan42
+/bundles/org.openhab.binding.serialbutton/ @kaikreuzer
+/bundles/org.openhab.binding.shelly/ @markus7017
+/bundles/org.openhab.binding.siemensrds/ @andrewfg
+/bundles/org.openhab.binding.silvercrestwifisocket/ @jmvaz
+/bundles/org.openhab.binding.sinope/ @chaton78
+/bundles/org.openhab.binding.sleepiq/ @syphr42
+/bundles/org.openhab.binding.smaenergymeter/ @monnimeter
+/bundles/org.openhab.binding.smartmeter/ @msteigenberger
+/bundles/org.openhab.binding.snmp/ @J-N-K
+/bundles/org.openhab.binding.solaredge/ @alexf2015
+/bundles/org.openhab.binding.solarlog/ @johannrichard
+/bundles/org.openhab.binding.somfytahoma/ @octa22
+/bundles/org.openhab.binding.sonos/ @kgoderis @lolodomo
+/bundles/org.openhab.binding.sonyaudio/ @freke
+/bundles/org.openhab.binding.sonyprojector/ @lolodomo
+/bundles/org.openhab.binding.spotify/ @Hilbrand
+/bundles/org.openhab.binding.squeezebox/ @digitaldan @mhilbush
+/bundles/org.openhab.binding.synopanalyzer/ @clinique
+/bundles/org.openhab.binding.systeminfo/ @svilenvul
+/bundles/org.openhab.binding.tado/ @dfrommi
+/bundles/org.openhab.binding.tankerkoenig/ @dolic @JueBag
+/bundles/org.openhab.binding.telegram/ @ZzetT
+/bundles/org.openhab.binding.tellstick/ @jarlebh
+/bundles/org.openhab.binding.tesla/ @kgoderis
+/bundles/org.openhab.binding.toon/ @jongj
+/bundles/org.openhab.binding.tplinksmarthome/ @Hilbrand
+/bundles/org.openhab.binding.tradfri/ @cweitkamp @kaikreuzer
+/bundles/org.openhab.binding.unifi/ @mgbowman
+/bundles/org.openhab.binding.urtsi/ @OLibutzki
+/bundles/org.openhab.binding.valloxmv/ @bjoernbrings
+/bundles/org.openhab.binding.vektiva/ @octa22
+/bundles/org.openhab.binding.velbus/ @cedricboon
+/bundles/org.openhab.binding.vitotronic/ @steand
+/bundles/org.openhab.binding.volvooncall/ @clinique
+/bundles/org.openhab.binding.weathercompany/ @mhilbush
+/bundles/org.openhab.binding.weatherunderground/ @lolodomo
+/bundles/org.openhab.binding.wemo/ @hmerk
+/bundles/org.openhab.binding.wifiled/ @rvt @xylo
+/bundles/org.openhab.binding.windcentrale/ @marcelrv
+/bundles/org.openhab.binding.xmltv/ @clinique
+/bundles/org.openhab.binding.xmppclient/ @pavel-gololobov
+/bundles/org.openhab.binding.yamahareceiver/ @davidgraeff @zarusz
+/bundles/org.openhab.binding.yeelight/ @claell
+/bundles/org.openhab.binding.zoneminder/ @Mr-Eskildsen
+/bundles/org.openhab.binding.zway/ @pathec
+/bundles/org.openhab.extensionservice.marketplace/ @kaikreuzer
+/bundles/org.openhab.extensionservice.marketplace.automation/ @kaikreuzer
+/bundles/org.openhab.io.azureiothub/ @nikotanghe
+/bundles/org.openhab.io.homekit/ @beowulfe
+/bundles/org.openhab.io.hueemulation/ @davidgraeff @digitaldan
+/bundles/org.openhab.io.imperihome/ @pdegeus
+/bundles/org.openhab.io.javasound/ @kaikreuzer
+/bundles/org.openhab.io.mqttembeddedbroker/ @davidgraeff
+/bundles/org.openhab.io.neeo/ @tmrobert8
+/bundles/org.openhab.io.openhabcloud/ @kaikreuzer
+/bundles/org.openhab.io.transport.modbus/ @ssalonen
+/bundles/org.openhab.io.webaudio/ @kaikreuzer
+/bundles/org.openhab.persistence.mapdb/ @mkhl
+/bundles/org.openhab.persistence.influxdb/ @lujop
+/bundles/org.openhab.transform.exec/ @openhab/add-ons-maintainers
+/bundles/org.openhab.transform.javascript/ @openhab/add-ons-maintainers
+/bundles/org.openhab.transform.jinja/ @jochen314
+/bundles/org.openhab.transform.jsonpath/ @clinique
+/bundles/org.openhab.transform.map/ @openhab/add-ons-maintainers
+/bundles/org.openhab.transform.regex/ @openhab/add-ons-maintainers
+/bundles/org.openhab.transform.scale/ @clinique
+/bundles/org.openhab.transform.xpath/ @openhab/add-ons-maintainers
+/bundles/org.openhab.transform.xslt/ @openhab/add-ons-maintainers
+/bundles/org.openhab.voice.googletts/ @gbicskei
+/bundles/org.openhab.voice.mactts/ @kaikreuzer
+/bundles/org.openhab.voice.marytts/ @kaikreuzer
+/bundles/org.openhab.voice.picotts/ @FlorianSW
+/bundles/org.openhab.voice.pollytts/ @hillmanr
+/bundles/org.openhab.voice.voicerss/ @JochenHiller
+/itests/org.openhab.binding.astro.tests/ @gerrieg
+/itests/org.openhab.binding.avmfritz.tests/ @cweitkamp
+/itests/org.openhab.binding.feed.tests/ @svilenvul
+/itests/org.openhab.binding.hue.tests/ @cweitkamp
+/itests/org.openhab.binding.max.tests/ @marcelrv
+/itests/org.openhab.binding.mqtt.homeassistant.tests/ @davidgraeff
+/itests/org.openhab.binding.mqtt.homie.tests/ @davidgraeff
+/itests/org.openhab.binding.nest.tests/ @wborn
+/itests/org.openhab.binding.ntp.tests/ @marcelrv
+/itests/org.openhab.binding.systeminfo.tests/ @svilenvul
+/itests/org.openhab.binding.tradfri.tests/ @cweitkamp @kaikreuzer
+/itests/org.openhab.binding.wemo.tests/ @hmerk
+/itests/org.openhab.io.hueemulation.tests/ @davidgraeff @digitaldan
+/itests/org.openhab.io.mqttembeddedbroker.tests/ @J-N-K
+/itests/org.openhab.persistence.mapdb.tests/ @mkhl
+
+# PLEASE HELP ADD FURTHER LINES HERE!
--- /dev/null
+# Contributing to openHAB
+
+Want to hack on openHAB? Awesome! Here are instructions to get you
+started. They are probably not perfect; please let us know if anything
+feels wrong or incomplete.
+
+## Build Environment
+
+For instructions on setting up your development environment, please
+see our dedicated [IDE setup guide](https://www.openhab.org/docs/developer/).
+
+## Contribution guidelines
+
+### Pull requests are always welcome
+
+We are always thrilled to receive pull requests, and do our best to
+process them as fast as possible. Not sure if that typo is worth a pull
+request? Do it! We will appreciate it.
+
+If your pull request is not accepted on the first try, don't be
+discouraged! If there's a problem with the implementation, hopefully you
+received feedback on what to improve.
+
+We're trying very hard to keep openHAB lean and focused. We don't want it
+to do everything for everybody. This means that we might decide against
+incorporating a new feature. However, there might be a way to implement
+that feature *on top of* openHAB.
+
+### Discuss your design in the discussion forum
+
+We recommend discussing your plans [in the discussion forum](https://community.openhab.org/c/add-ons)
+before starting to code - especially for more ambitious contributions.
+This gives other contributors a chance to point you in the right
+direction, give feedback on your design, and maybe point out if someone
+else is working on the same thing.
+
+### Create issues...
+
+Any significant improvement should be documented as [a GitHub
+issue](https://github.com/openhab/openhab-addons/issues?labels=enhancement&page=1&state=open) before anybody
+starts working on it.
+
+### ...but check for existing issues first!
+
+Please take a moment to check that an issue doesn't already exist
+documenting your bug report or improvement proposal. If it does, it
+never hurts to add a quick "+1" or "I have this problem too". This will
+help prioritize the most common problems and requests.
+
+### Conventions
+
+Fork the repo and make changes on your fork in a feature branch.
+
+Submit unit tests for your changes. openHAB has a great test framework built in; use
+it! Take a look at existing tests for inspiration. Run the full test suite on
+your branch before submitting a pull request.
+
+Update the documentation when creating or modifying features. Test
+your documentation changes for clarity, concision, and correctness, as
+well as a clean documentation build.
+
+Write clean code. Universally formatted code promotes ease of writing, reading,
+and maintenance.
+
+Pull request descriptions should be as clear as possible and include a
+reference to all the issues that they address.
+
+Pull requests must not contain commits from other users or branches.
+
+Commit messages must start with a capitalized and short summary (max. 50
+chars) written in the imperative, followed by an optional, more detailed
+explanatory text which is separated from the summary by an empty line.
+
+Code review comments may be added to your pull request. Discuss, then make the
+suggested modifications and push additional commits to your feature branch. Be
+sure to post a comment after pushing. The new commits will show up in the pull
+request automatically, but the reviewers will not be notified unless you
+comment.
+
+Commits that fix or close an issue should include a reference like `Fixes #XXX`,
+which will automatically close the issue when merged.
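+
+A hypothetical commit message following these conventions (the binding
+name and issue number are illustrative):
+
+```
+Fix wrongly handled exception in astro binding
+
+The handler threw an unchecked exception when the configured location
+was missing, which stopped the refresh job.
+
+Fixes #1234
+Signed-off-by: Joe Smith <joe.smith@email.com>
+```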
+
+### Sign your work
+
+The sign-off is a simple line at the end of the explanation for the
+patch, which certifies that you wrote it or otherwise have the right to
+pass it on as an open-source patch. The rules are pretty simple: if you
+can certify the below (from
+[developercertificate.org](https://developercertificate.org/)):
+
+```
+Developer Certificate of Origin
+Version 1.1
+
+Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
+660 York Street, Suite 102,
+San Francisco, CA 94110 USA
+
+Everyone is permitted to copy and distribute verbatim copies of this
+license document, but changing it is not allowed.
+
+
+Developer's Certificate of Origin 1.1
+
+By making a contribution to this project, I certify that:
+
+(a) The contribution was created in whole or in part by me and I
+ have the right to submit it under the open source license
+ indicated in the file; or
+
+(b) The contribution is based upon previous work that, to the best
+ of my knowledge, is covered under an appropriate open source
+ license and I have the right under that license to submit that
+ work with modifications, whether created in whole or in part
+ by me, under the same open source license (unless I am
+ permitted to submit under a different license), as indicated
+ in the file; or
+
+(c) The contribution was provided directly to me by some other
+ person who certified (a), (b) or (c) and I have not modified
+ it.
+
+(d) I understand and agree that this project and the contribution
+ are public and that a record of the contribution (including all
+ personal information I submit with it, including my sign-off) is
+ maintained indefinitely and may be redistributed consistent with
+ this project or the open source license(s) involved.
+```
+
+then you just add a line to every git commit message:
+
+ Signed-off-by: Joe Smith <joe.smith@email.com>
+
+using your real name (sorry, no pseudonyms or anonymous contributions) and an
+e-mail address at which you can be reached (sorry, no GitHub noreply e-mail
+addresses such as username@users.noreply.github.com or other non-reachable
+addresses).
+
+On the command line you can use `git commit -s` to sign off the commit.
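+
+For example (standard git options; the commit message is illustrative):
+
+```
+# Sign off a new commit while committing
+git commit -s -m "Fix wrongly handled exception in astro binding"
+
+# Add the sign-off to the most recent commit if you forgot it
+git commit --amend -s --no-edit
+```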
+
+### How can I become a maintainer?
+
+* Step 1: learn the component inside out
+* Step 2: make yourself useful by contributing code, bugfixes, support, etc.
+* Step 3: volunteer on [the discussion group](https://github.com/openhab/openhab-addons/issues?labels=question&page=1&state=open)
+
+Don't forget: being a maintainer is a time investment. Make sure you will have the time to be available.
+You don't have to be a maintainer to make a difference on the project!
+
+## Community Guidelines
+
+We want to keep the openHAB community awesome, growing and collaborative. We
+need your help to keep it that way. To help with this we have come up with some
+general guidelines for the community as a whole:
+
+* Be nice: Be courteous, respectful and polite to fellow community members: no
+ regional, racial, gender, or other abuse will be tolerated. We like nice people
+ way better than mean ones!
+
+* Encourage diversity and participation: Make everyone in our community
+ feel welcome, regardless of their background and the extent of their
+ contributions, and do everything possible to encourage participation in
+ our community.
+
+* Keep it legal: Basically, don't get us in trouble. Share only content that
+ you own, do not share private or sensitive information, and don't break the
+ law.
+
+* Stay on topic: Make sure that you are posting to the correct channel
+ and avoid off-topic discussions. Remember when you update an issue or
+ respond to an email you are potentially sending to a large number of
+ people. Please consider this before you update. Also remember that
+ nobody likes spam.
+
--- /dev/null
+Eclipse Public License - v 2.0
+
+ THE ACCOMPANYING PROGRAM IS PROVIDED UNDER THE TERMS OF THIS ECLIPSE
+ PUBLIC LICENSE ("AGREEMENT"). ANY USE, REPRODUCTION OR DISTRIBUTION
+ OF THE PROGRAM CONSTITUTES RECIPIENT'S ACCEPTANCE OF THIS AGREEMENT.
+
+1. DEFINITIONS
+
+"Contribution" means:
+
+ a) in the case of the initial Contributor, the initial content
+ Distributed under this Agreement, and
+
+ b) in the case of each subsequent Contributor:
+ i) changes to the Program, and
+ ii) additions to the Program;
+ where such changes and/or additions to the Program originate from
+ and are Distributed by that particular Contributor. A Contribution
+ "originates" from a Contributor if it was added to the Program by
+ such Contributor itself or anyone acting on such Contributor's behalf.
+ Contributions do not include changes or additions to the Program that
+ are not Modified Works.
+
+"Contributor" means any person or entity that Distributes the Program.
+
+"Licensed Patents" mean patent claims licensable by a Contributor which
+are necessarily infringed by the use or sale of its Contribution alone
+or when combined with the Program.
+
+"Program" means the Contributions Distributed in accordance with this
+Agreement.
+
+"Recipient" means anyone who receives the Program under this Agreement
+or any Secondary License (as applicable), including Contributors.
+
+"Derivative Works" shall mean any work, whether in Source Code or other
+form, that is based on (or derived from) the Program and for which the
+editorial revisions, annotations, elaborations, or other modifications
+represent, as a whole, an original work of authorship.
+
+"Modified Works" shall mean any work in Source Code or other form that
+results from an addition to, deletion from, or modification of the
+contents of the Program, including, for purposes of clarity any new file
+in Source Code form that contains any contents of the Program. Modified
+Works shall not include works that contain only declarations,
+interfaces, types, classes, structures, or files of the Program solely
+in each case in order to link to, bind by name, or subclass the Program
+or Modified Works thereof.
+
+"Distribute" means the acts of a) distributing or b) making available
+in any manner that enables the transfer of a copy.
+
+"Source Code" means the form of a Program preferred for making
+modifications, including but not limited to software source code,
+documentation source, and configuration files.
+
+"Secondary License" means either the GNU General Public License,
+Version 2.0, or any later versions of that license, including any
+exceptions or additional permissions as identified by the initial
+Contributor.
+
+2. GRANT OF RIGHTS
+
+ a) Subject to the terms of this Agreement, each Contributor hereby
+ grants Recipient a non-exclusive, worldwide, royalty-free copyright
+ license to reproduce, prepare Derivative Works of, publicly display,
+ publicly perform, Distribute and sublicense the Contribution of such
+ Contributor, if any, and such Derivative Works.
+
+ b) Subject to the terms of this Agreement, each Contributor hereby
+ grants Recipient a non-exclusive, worldwide, royalty-free patent
+ license under Licensed Patents to make, use, sell, offer to sell,
+ import and otherwise transfer the Contribution of such Contributor,
+ if any, in Source Code or other form. This patent license shall
+ apply to the combination of the Contribution and the Program if, at
+ the time the Contribution is added by the Contributor, such addition
+ of the Contribution causes such combination to be covered by the
+ Licensed Patents. The patent license shall not apply to any other
+ combinations which include the Contribution. No hardware per se is
+ licensed hereunder.
+
+ c) Recipient understands that although each Contributor grants the
+ licenses to its Contributions set forth herein, no assurances are
+ provided by any Contributor that the Program does not infringe the
+ patent or other intellectual property rights of any other entity.
+ Each Contributor disclaims any liability to Recipient for claims
+ brought by any other entity based on infringement of intellectual
+ property rights or otherwise. As a condition to exercising the
+ rights and licenses granted hereunder, each Recipient hereby
+ assumes sole responsibility to secure any other intellectual
+ property rights needed, if any. For example, if a third party
+ patent license is required to allow Recipient to Distribute the
+ Program, it is Recipient's responsibility to acquire that license
+ before distributing the Program.
+
+ d) Each Contributor represents that to its knowledge it has
+ sufficient copyright rights in its Contribution, if any, to grant
+ the copyright license set forth in this Agreement.
+
+ e) Notwithstanding the terms of any Secondary License, no
+ Contributor makes additional grants to any Recipient (other than
+ those set forth in this Agreement) as a result of such Recipient's
+ receipt of the Program under the terms of a Secondary License
+ (if permitted under the terms of Section 3).
+
+3. REQUIREMENTS
+
+3.1 If a Contributor Distributes the Program in any form, then:
+
+ a) the Program must also be made available as Source Code, in
+ accordance with section 3.2, and the Contributor must accompany
+ the Program with a statement that the Source Code for the Program
+ is available under this Agreement, and informs Recipients how to
+ obtain it in a reasonable manner on or through a medium customarily
+ used for software exchange; and
+
+ b) the Contributor may Distribute the Program under a license
+ different than this Agreement, provided that such license:
+ i) effectively disclaims on behalf of all other Contributors all
+ warranties and conditions, express and implied, including
+ warranties or conditions of title and non-infringement, and
+ implied warranties or conditions of merchantability and fitness
+ for a particular purpose;
+
+ ii) effectively excludes on behalf of all other Contributors all
+ liability for damages, including direct, indirect, special,
+ incidental and consequential damages, such as lost profits;
+
+ iii) does not attempt to limit or alter the recipients' rights
+ in the Source Code under section 3.2; and
+
+ iv) requires any subsequent distribution of the Program by any
+ party to be under a license that satisfies the requirements
+ of this section 3.
+
+3.2 When the Program is Distributed as Source Code:
+
+ a) it must be made available under this Agreement, or if the
+ Program (i) is combined with other material in a separate file or
+ files made available under a Secondary License, and (ii) the initial
+ Contributor attached to the Source Code the notice described in
+ Exhibit A of this Agreement, then the Program may be made available
+ under the terms of such Secondary Licenses, and
+
+ b) a copy of this Agreement must be included with each copy of
+ the Program.
+
+3.3 Contributors may not remove or alter any copyright, patent,
+trademark, attribution notices, disclaimers of warranty, or limitations
+of liability ("notices") contained within the Program from any copy of
+the Program which they Distribute, provided that Contributors may add
+their own appropriate notices.
+
+4. COMMERCIAL DISTRIBUTION
+
+Commercial distributors of software may accept certain responsibilities
+with respect to end users, business partners and the like. While this
+license is intended to facilitate the commercial use of the Program,
+the Contributor who includes the Program in a commercial product
+offering should do so in a manner which does not create potential
+liability for other Contributors. Therefore, if a Contributor includes
+the Program in a commercial product offering, such Contributor
+("Commercial Contributor") hereby agrees to defend and indemnify every
+other Contributor ("Indemnified Contributor") against any losses,
+damages and costs (collectively "Losses") arising from claims, lawsuits
+and other legal actions brought by a third party against the Indemnified
+Contributor to the extent caused by the acts or omissions of such
+Commercial Contributor in connection with its distribution of the Program
+in a commercial product offering. The obligations in this section do not
+apply to any claims or Losses relating to any actual or alleged
+intellectual property infringement. In order to qualify, an Indemnified
+Contributor must: a) promptly notify the Commercial Contributor in
+writing of such claim, and b) allow the Commercial Contributor to control,
+and cooperate with the Commercial Contributor in, the defense and any
+related settlement negotiations. The Indemnified Contributor may
+participate in any such claim at its own expense.
+
+For example, a Contributor might include the Program in a commercial
+product offering, Product X. That Contributor is then a Commercial
+Contributor. If that Commercial Contributor then makes performance
+claims, or offers warranties related to Product X, those performance
+claims and warranties are such Commercial Contributor's responsibility
+alone. Under this section, the Commercial Contributor would have to
+defend claims against the other Contributors related to those performance
+claims and warranties, and if a court requires any other Contributor to
+pay any damages as a result, the Commercial Contributor must pay
+those damages.
+
+5. NO WARRANTY
+
+EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, AND TO THE EXTENT
+PERMITTED BY APPLICABLE LAW, THE PROGRAM IS PROVIDED ON AN "AS IS"
+BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR
+IMPLIED INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF
+TITLE, NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR
+PURPOSE. Each Recipient is solely responsible for determining the
+appropriateness of using and distributing the Program and assumes all
+risks associated with its exercise of rights under this Agreement,
+including but not limited to the risks and costs of program errors,
+compliance with applicable laws, damage to or loss of data, programs
+or equipment, and unavailability or interruption of operations.
+
+6. DISCLAIMER OF LIABILITY
+
+EXCEPT AS EXPRESSLY SET FORTH IN THIS AGREEMENT, AND TO THE EXTENT
+PERMITTED BY APPLICABLE LAW, NEITHER RECIPIENT NOR ANY CONTRIBUTORS
+SHALL HAVE ANY LIABILITY FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING WITHOUT LIMITATION LOST
+PROFITS), HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ARISING IN ANY WAY OUT OF THE USE OR DISTRIBUTION OF THE PROGRAM OR THE
+EXERCISE OF ANY RIGHTS GRANTED HEREUNDER, EVEN IF ADVISED OF THE
+POSSIBILITY OF SUCH DAMAGES.
+
+7. GENERAL
+
+If any provision of this Agreement is invalid or unenforceable under
+applicable law, it shall not affect the validity or enforceability of
+the remainder of the terms of this Agreement, and without further
+action by the parties hereto, such provision shall be reformed to the
+minimum extent necessary to make such provision valid and enforceable.
+
+If Recipient institutes patent litigation against any entity
+(including a cross-claim or counterclaim in a lawsuit) alleging that the
+Program itself (excluding combinations of the Program with other software
+or hardware) infringes such Recipient's patent(s), then such Recipient's
+rights granted under Section 2(b) shall terminate as of the date such
+litigation is filed.
+
+All Recipient's rights under this Agreement shall terminate if it
+fails to comply with any of the material terms or conditions of this
+Agreement and does not cure such failure in a reasonable period of
+time after becoming aware of such noncompliance. If all Recipient's
+rights under this Agreement terminate, Recipient agrees to cease use
+and distribution of the Program as soon as reasonably practicable.
+However, Recipient's obligations under this Agreement and any licenses
+granted by Recipient relating to the Program shall continue and survive.
+
+Everyone is permitted to copy and distribute copies of this Agreement,
+but in order to avoid inconsistency the Agreement is copyrighted and
+may only be modified in the following manner. The Agreement Steward
+reserves the right to publish new versions (including revisions) of
+this Agreement from time to time. No one other than the Agreement
+Steward has the right to modify this Agreement. The Eclipse Foundation
+is the initial Agreement Steward. The Eclipse Foundation may assign the
+responsibility to serve as the Agreement Steward to a suitable separate
+entity. Each new version of the Agreement will be given a distinguishing
+version number. The Program (including Contributions) may always be
+Distributed subject to the version of the Agreement under which it was
+received. In addition, after a new version of the Agreement is published,
+Contributor may elect to Distribute the Program (including its
+Contributions) under the new version.
+
+Except as expressly stated in Sections 2(a) and 2(b) above, Recipient
+receives no rights or licenses to the intellectual property of any
+Contributor under this Agreement, whether expressly, by implication,
+estoppel or otherwise. All rights in the Program not expressly granted
+under this Agreement are reserved. Nothing in this Agreement is intended
+to be enforceable by any entity that is not a Contributor or Recipient.
+No third-party beneficiary rights are created under this Agreement.
+
+Exhibit A - Form of Secondary Licenses Notice
+
+"This Source Code may also be made available under the following
+Secondary Licenses when the conditions for such availability set forth
+in the Eclipse Public License, v. 2.0 are satisfied: {name license(s),
+version(s), and exceptions or additional permissions here}."
+
+ Simply including a copy of this Agreement, including this Exhibit A
+ is not sufficient to license the Source Code under Secondary Licenses.
+
+ If it is not possible or desirable to put the notice in a particular
+ file, then You may include the notice in a location (such as a LICENSE
+ file in a relevant directory) where a recipient would be likely to
+ look for such a notice.
+
+ You may add additional accurate notices of copyright ownership.
--- /dev/null
+# openHAB Add-ons
+
+<img align="right" width="220" src="./logo.png" />
+
+[](https://travis-ci.com/openhab/openhab-addons)
+[](https://opensource.org/licenses/EPL-2.0)
+[](https://www.bountysource.com/teams/openhab/issues?tracker_ids=2164344)
+
+This repository contains the official set of add-ons that are implemented on top of openHAB Core APIs.
+Add-ons that are accepted here will be maintained (e.g. adapted to new core APIs)
+by the [openHAB Add-on maintainers](https://github.com/orgs/openhab/teams/add-ons-maintainers).
+
+To get started with binding development, follow our guidelines and tutorials over at https://www.openhab.org/docs/developer.
+
+If you are interested in openHAB Core development, we invite you to visit https://github.com/openhab/openhab-core.
+
+## Add-ons in other repositories
+
+Some add-ons are not in this repository, but still part of the official [openHAB distribution](https://github.com/openhab/openhab-distro).
+An incomplete list of other repositories:
+
+* https://github.com/openhab/org.openhab.binding.zwave
+* https://github.com/openhab/org.openhab.binding.zigbee
+* https://github.com/openhab/openhab-webui
+
+## Development / Repository Organization
+
+openHAB add-ons are [Java](https://en.wikipedia.org/wiki/Java_(programming_language)) `.jar` files.
+
+The openHAB build system is based on [Maven](https://maven.apache.org/what-is-maven.html).
+The official IDE (Integrated Development Environment) is Eclipse.
+
+The repository is structured as follows:
+
+```
+.
++-- bom Maven buildsystem: Bill of materials
+| +-- openhab-addons Lists all extensions for other repos to reference them
+| +-- ... Other boms
+|
++-- bundles Official openHAB extensions
+| +-- org.openhab.binding.airquality
+| +-- org.openhab.binding.astro
+| +-- ...
+|
++-- features Part of the runtime dependency resolver ("Karaf features")
+|
++-- itests Integration tests. Those tests require parts of the framework to run.
+| +-- org.openhab.binding.astro.tests
+| +-- org.openhab.binding.avmfritz.tests
+| +-- ...
+|
++-- src/etc Auxiliary buildsystem files: the license header for automatic checks, for example
++-- tools Static code analyser instructions
+|
++-- CODEOWNERS This file assigns people to directories so that they are informed if a pull request
+ would modify their add-ons.
+```
+
+### Command line build
+
+To build all add-ons from the command-line, type in:
+
+`mvn clean install`
+
+Optionally you can skip tests (`-DskipTests`) or some of the static analysis (`-DskipChecks`).
+This improves the build time but could hide problems in your code.
+For binding development you should run that command without skipping checks and tests.
+To check if your code is following the [code style](https://www.openhab.org/docs/developer/guidelines.html#b-code-formatting-rules-style) run `mvn spotless:check`.
+If Maven prints `[INFO] Spotless check skipped` then run `mvn spotless:check -Dspotless.check.skip=false` instead as the check is not mandatory yet.
+To reformat your code, run `mvn spotless:apply`.
+
+Subsequent calls can include `-o` for offline mode, as in `mvn clean install -DskipChecks -o`, which will be a bit faster.
+
+For integration tests you might need to run: `mvn clean install -DwithResolver -DskipChecks`
+
+You will find the generated `.jar` file for each bundle in its respective `/target` directory.
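+
+After a full build, you can rebuild just a single add-on using standard Maven module selection (a sketch, with the astro binding as an example):
+
+`mvn clean install -pl :org.openhab.binding.astro`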
+
+### How to develop via an Integrated Development Environment (IDE)
+
+We have assembled some step-by-step guides for different IDEs on our developer documentation website:
+
+https://www.openhab.org/docs/developer/#setup-the-development-environment
+
+Happy coding!
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.core.bom.openhab-addons</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bom</groupId>
+ <artifactId>org.openhab.addons.reactor.bom</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.addons.bom.openhab-addons</artifactId>
+ <packaging>pom</packaging>
+
+ <name>openHAB Add-ons :: BOM :: openHAB Add-ons</name>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.binding.nest</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.persistence.dynamodb</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.persistence.influxdb</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.persistence.jdbc</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.persistence.jpa</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.persistence.mapdb</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.persistence.mongodb</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.persistence.rrd4j</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.voice.googletts</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ </dependencies>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ <attribute name="test" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.addons.bom.openhab-core-index</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bom</groupId>
+ <artifactId>org.openhab.addons.reactor.bom</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.addons.bom.openhab-core-index</artifactId>
+
+ <name>openHAB Add-ons :: BOM :: openHAB Core Index</name>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.openhab-core</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>compile</scope>
+ <optional>true</optional>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-maven-plugin</artifactId>
+ </plugin>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-indexer-maven-plugin</artifactId>
+ </plugin>
+ </plugins>
+ </build>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons</groupId>
+ <artifactId>org.openhab.addons.reactor</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <groupId>org.openhab.addons.bom</groupId>
+ <artifactId>org.openhab.addons.reactor.bom</artifactId>
+ <packaging>pom</packaging>
+
+ <name>openHAB Add-ons :: BOM</name>
+
+ <modules>
+ <module>runtime-index</module>
+ <module>test-index</module>
+ <module>openhab-core-index</module>
+ <module>openhab-addons</module>
+ </modules>
+
+ <build>
+ <pluginManagement>
+ <plugins>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-antrun-plugin</artifactId>
+ <version>1.8</version>
+ <inherited>false</inherited>
+ <executions>
+ <execution>
+ <id>create-bom</id>
+ <goals>
+ <goal>run</goal>
+ </goals>
+ <configuration>
+ <target>
+ <copy file="${basedirRoot}/../../bundles/pom.xml" overwrite="true"
+ tofile="${basedirRoot}/../../bom/openhab-addons/pom.xml"/>
+ <!-- rewrite footer -->
+ <replaceregexp file="${basedirRoot}/../../bom/openhab-addons/pom.xml"
+ match="/modules[\s\S]*dependencies>" replace="/dependencies>"/>
+ <!-- rewrite header -->
+ <replaceregexp file="${basedirRoot}/../../bom/openhab-addons/pom.xml"
+ match="\S*parent[\s\S]*modules>\S*" replace="header"/>
+ <replace file="${basedirRoot}/../../bom/openhab-addons/pom.xml">
+ <replacetoken>header</replacetoken>
+ <replacevalue><![CDATA[<parent>
+ <groupId>org.openhab.addons.bom</groupId>
+ <artifactId>org.openhab.addons.reactor.bom</artifactId>
+ <version>${project.version}</version>
+ </parent>
+
+ <artifactId>org.openhab.addons.bom.openhab-addons</artifactId>
+ <packaging>pom</packaging>
+
+ <name>openHAB Add-ons :: BOM :: openHAB Add-ons</name>
+
+ <dependencies>]]></replacevalue>
+ </replace>
+ <!-- rewrite content -->
+ <replace file="${basedirRoot}/../../bom/openhab-addons/pom.xml">
+ <replacetoken><![CDATA[<module>]]></replacetoken>
+ <replacevalue><![CDATA[<dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>]]></replacevalue>
+ </replace>
+ <replace file="${basedirRoot}/../../bom/openhab-addons/pom.xml">
+ <replacetoken><![CDATA[</module>]]></replacetoken>
+ <replacevalue><![CDATA[</artifactId>
+ <version>@dollar{project.version}</version>
+ </dependency>]]></replacevalue>
+ </replace>
+ <replace file="${basedirRoot}/../../bom/openhab-addons/pom.xml">
+ <replacetoken>@dollar</replacetoken>
+ <replacevalue>$</replacevalue>
+ </replace>
+ </target>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </pluginManagement>
+ </build>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ <attribute name="test" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.addons.bom.runtime-index</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bom</groupId>
+ <artifactId>org.openhab.addons.reactor.bom</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.addons.bom.runtime-index</artifactId>
+
+ <name>openHAB Add-ons :: BOM :: Runtime Index</name>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.runtime</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>compile</scope>
+ <optional>true</optional>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-maven-plugin</artifactId>
+ </plugin>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-indexer-maven-plugin</artifactId>
+ </plugin>
+ </plugins>
+ </build>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ <attribute name="test" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.addons.bom.test-index</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bom</groupId>
+ <artifactId>org.openhab.addons.reactor.bom</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.addons.bom.test-index</artifactId>
+
+ <name>openHAB Add-ons :: BOM :: Test Index</name>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.test</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>compile</scope>
+ <optional>true</optional>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.test-index</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>compile</scope>
+ <optional>true</optional>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-maven-plugin</artifactId>
+ </plugin>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-indexer-maven-plugin</artifactId>
+ </plugin>
+ </plugins>
+ </build>
+
+</project>
--- /dev/null
+#!/bin/bash
+
+set -o pipefail # exit build with error when pipes fail
+
+function prevent_timeout() {
+ local i=0
+ while [[ -e /proc/$1 ]]; do
+ # print zero width char every 3 minutes while building
+ if [[ "$i" -eq "180" ]]; then printf %b '\u200b'; i=0; else i=$((i+1)); fi
+ sleep 1
+ done
+}
+
+function print_reactor_summary() {
+ sed -ne '/\[INFO\] Reactor Summary.*:/,$ p' "$1" | sed 's/\[INFO\] //'
+}
+
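+# Runs Maven and prints compact per-module progress; used below as: mvnp clean install -B -DskipChecks=true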
+function mvnp() {
+ local command=(mvn "$@")
+ exec "${command[@]}" 2>&1 | # execute, redirect stderr to stdout
+ stdbuf -o0 grep -vE "Download(ed|ing) from [a-z.]+: https:" | # filter out downloads
+ tee .build.log | # write output to log
+ stdbuf -oL grep -aE '^\[INFO\] Building .+ \[.+\]$' | # filter progress
+ stdbuf -o0 sed -uE 's/^\[INFO\] Building (.*[^ ])[ ]+\[([0-9]+\/[0-9]+)\]$/\2| \1/' | # prefix project name with progress
+ stdbuf -o0 sed -e :a -e 's/^.\{1,6\}|/ &/;ta' & # right align progress with padding
+ local pid=$!
+ prevent_timeout ${pid} &
+ wait ${pid}
+}
+
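+# Optional first argument selects the commit range to diff; defaults to "master...HEAD".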
+COMMITS=${1:-"master...HEAD"}
+
+# Determine if this is a single changed addon -> Perform build with tests + integration tests and all SAT checks
+CHANGED_BUNDLE_DIR=`git diff --dirstat=files,0 ${COMMITS} bundles/ | sed 's/^[ 0-9.]\+% bundles\///g' | grep -o -P "^([^/]*)" | uniq`
+# Determine if this is a single changed itest -> Perform build with tests + integration tests and all SAT checks
+# For this we have to remove '.tests' from the folder name.
+CHANGED_ITEST_DIR=`git diff --dirstat=files,0 ${COMMITS} itests/ | sed 's/^[ 0-9.]\+% itests\///g' | sed 's/\.tests\///g' | uniq`
+CDIR=`pwd`
+
+# if a bundle and (optionally) its linked itests were changed, build the module and its tests
+if [[ ! -z "$CHANGED_BUNDLE_DIR" && -e "bundles/$CHANGED_BUNDLE_DIR" && ( "$CHANGED_BUNDLE_DIR" == "$CHANGED_ITEST_DIR" || -z "$CHANGED_ITEST_DIR" ) ]]; then
+ CHANGED_DIR="$CHANGED_BUNDLE_DIR"
+fi
+
+# if no bundle was changed but only itests
+if [[ -z "$CHANGED_BUNDLE_DIR" ]] && [[ -e "bundles/$CHANGED_ITEST_DIR" ]]; then
+ CHANGED_DIR="$CHANGED_ITEST_DIR"
+fi
+
+if [[ ! -z "$CHANGED_DIR" ]] && [[ -e "bundles/$CHANGED_DIR" ]]; then
+ echo "Single addon pull request: Building $CHANGED_DIR"
+ echo "MAVEN_OPTS='-Xms1g -Xmx2g -Dorg.slf4j.simpleLogger.log.org.openhab.tools.analysis.report.ReportUtility=DEBUG -Dorg.slf4j.simpleLogger.defaultLogLevel=WARN'" > ~/.mavenrc
+ ARTIFACT_ID=$(mvn -f bundles/${CHANGED_DIR}/pom.xml help:evaluate -Dexpression=project.artifactId -q -DforceStdout)
+ mvn clean install -B -am -pl ":$ARTIFACT_ID" 2>&1 |
+ stdbuf -o0 grep -vE "Download(ed|ing) from [a-z.]+: https:" | # Filter out Download(s)
+ stdbuf -o0 grep -v "target/code-analysis" | # filter out some debug code from reporting utility
+ tee ${CDIR}/.build.log
+ if [[ $? -ne 0 ]]; then
+ exit 1
+ fi
+
+ # add the ".tests" suffix to make sure we actually find the correct itest
+ if [[ -e "itests/$CHANGED_DIR.tests" ]]; then
+ echo "Single addon pull request: Building itest $CHANGED_DIR"
+ cd "itests/$CHANGED_DIR.tests"
+ mvn clean install -B 2>&1 |
+ stdbuf -o0 grep -vE "Download(ed|ing) from [a-z.]+: https:" | # Filter out Download(s)
+ stdbuf -o0 grep -v "target/code-analysis" | # filter out some debug code from reporting utility
+ tee -a ${CDIR}/.build.log
+ if [[ $? -ne 0 ]]; then
+ exit 1
+ fi
+ fi
+else
+ echo "Build all"
+ echo "MAVEN_OPTS='-Xms1g -Xmx2g -Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn'" > ~/.mavenrc
+ mvnp clean install -B -DskipChecks=true
+ if [[ $? -eq 0 ]]; then
+ print_reactor_summary .build.log
+ else
+ tail -n 1000 .build.log
+ exit 1
+ fi
+fi
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
+ https://maven.apache.org/xsd/settings-1.0.0.xsd">
+ <profiles>
+ <profile>
+ <id>openHAB-snapshots</id>
+ <repositories>
+ <repository>
+ <id>archetype</id>
+ <url>https://openhab.jfrog.io/openhab/libs-snapshot</url>
+ </repository>
+ </repositories>
+ </profile>
+ </profiles>
+ <activeProfiles>
+ <activeProfile>openHAB-snapshots</activeProfile>
+ </activeProfiles>
+</settings>
--- /dev/null
+@echo off
+
+SETLOCAL
+SET ARGC=0
+
+FOR %%x IN (%*) DO SET /A ARGC+=1
+
+IF %ARGC% NEQ 3 (
+ echo Usage: %0 BindingIdInCamelCase Author GithubUser
+ exit /B 1
+)
+
+SET OpenhabVersion="3.0.0-SNAPSHOT"
+
+SET BindingIdInCamelCase=%~1
+SET BindingIdInLowerCase=%BindingIdInCamelCase%
+SET Author=%~2
+SET GithubUser=%~3
+
+call :LoCase BindingIdInLowerCase
+
+call mvn -s archetype-settings.xml archetype:generate -N -DarchetypeGroupId=org.openhab.core.tools.archetypes -DarchetypeArtifactId=org.openhab.core.tools.archetypes.binding -DarchetypeVersion=%OpenhabVersion% -DgroupId=org.openhab.binding -DartifactId=org.openhab.binding.%BindingIdInLowerCase% -Dpackage=org.openhab.binding.%BindingIdInLowerCase% -Dversion=%OpenhabVersion% -DbindingId=%BindingIdInLowerCase% -DbindingIdCamelCase=%BindingIdInCamelCase% -DvendorName=openHAB -Dnamespace=org.openhab -Dauthor="%Author%" -DgithubUser="%GithubUser%"
+
+COPY ..\src\etc\NOTICE org.openhab.binding.%BindingIdInLowerCase%\
+
+(SET BindingIdInLowerCase=)
+(SET BindingIdInCamelCase=)
+(SET Author=)
+(SET GithubUser=)
+
+GOTO:EOF
+
+
+:LoCase
+:: Subroutine to convert a variable VALUE to all lower case.
+:: The argument for this subroutine is the variable NAME.
+FOR %%i IN ("A=a" "B=b" "C=c" "D=d" "E=e" "F=f" "G=g" "H=h" "I=i" "J=j" "K=k" "L=l" "M=m" "N=n" "O=o" "P=p" "Q=q" "R=r" "S=s" "T=t" "U=u" "V=v" "W=w" "X=x" "Y=y" "Z=z") DO CALL SET "%1=%%%1:%%~i%%"
+GOTO:EOF
+
+ENDLOCAL
--- /dev/null
+#!/bin/bash
+
+[ $# -lt 3 ] && { echo "Usage: $0 <BindingIdInCamelCase> <Author> <GitHub Username>"; exit 1; }
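+
+# Example invocation with hypothetical values (binding id, author, GitHub user):
+#   <this script> AcmeWeather "Jane Doe" janedoe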
+
+openHABVersion=3.0.0-SNAPSHOT
+
+camelcaseId=$1
+id=`echo $camelcaseId | tr '[:upper:]' '[:lower:]'`
+
+author=$2
+githubUser=$3
+
+mvn -s archetype-settings.xml archetype:generate -N \
+ -DarchetypeGroupId=org.openhab.core.tools.archetypes \
+ -DarchetypeArtifactId=org.openhab.core.tools.archetypes.binding \
+ -DarchetypeVersion=$openHABVersion \
+ -DgroupId=org.openhab.binding \
+ -DartifactId=org.openhab.binding.$id \
+ -Dpackage=org.openhab.binding.$id \
+ -Dversion=$openHABVersion \
+ -DbindingId=$id \
+ -DbindingIdCamelCase=$camelcaseId \
+ -DvendorName=openHAB \
+ -Dnamespace=org.openhab \
+ -Dauthor="$author" \
+ -DgithubUser="$githubUser"
+
+directory="org.openhab.binding.$id/"
+
+cp ../src/etc/NOTICE "$directory"
+
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ <attribute name="test" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.binding.nest</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+# Nest Binding
+
+The Nest binding integrates devices by [Nest](https://nest.com) using the [Nest API](https://developers.nest.com/documentation/cloud/get-started) (REST).
+
+Because the Nest API runs on Nest's servers, an Internet connection is required for sending and receiving information.
+The binding connects to the Nest API via HTTPS on ports 443 and 9553. Make sure outbound connections to these ports are not blocked by a firewall.
+
+> Note: This binding can only be used with Nest devices if you have an existing Nest developer account signed up for the Works with Nest (WWN) program.
+New integrations using the WWN program are no longer accepted because WWN is being retired.
+To keep using this binding do **NOT** migrate your Nest Account to a Google Account.
+For more information see [What's happening at Nest?](https://nest.com/whats-happening/).
+
+## Supported Things
+
+The table below lists the Nest binding thing types:
+
+| Things | Description | Thing Type |
+|-----------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------|----------------|
+| Nest Account | An account for using the Nest REST API | account |
+| Nest Cam (Indoor, IQ, Outdoor), Dropcam | A Nest Cam registered with your account | camera |
+| Nest Protect | The smoke detector/Nest Protect for the account | smoke_detector |
+| Structure                               | The Nest structure defines the house the account has set up on Nest. You will only have more than one structure if you have more than one house | structure      |
+| Nest Thermostat (E) | A Thermostat to control the various aspects of the house's HVAC system | thermostat |
+
+## Authorization
+
+The Nest API uses OAuth for authorization.
+Therefore the binding needs some authorization parameters before it can access your Nest account via the Nest API.
+
+To get these authorization parameters you first need to sign up as a [Nest Developer](https://developer.nest.com) and [register a new Product](https://developer.nest.com/products/new) (free and instant).
+
+While registering a new Product (on the Product Details page), make sure to:
+
+* Leave both "OAuth Redirect URI" fields empty to enable PIN-based authorization.
+* Grant all the permissions you intend to use. When in doubt, grant the permission, because the binding needs to be reauthorized whenever the permissions are changed at a later time.
+
+After creating the Product, your browser shows the Product Overview page.
+This page contains the **Product ID** and **Product Secret** authorization parameters that are used by the binding.
+Take note of both parameters or keep this page open in a browser tab.
+Now copy and paste the "Authorization URL" in a new browser tab.
+Accept the permissions and you will be presented with the **Pincode** authorization parameter that is also used by the binding.
+
+You can return to the Product Overview page at a later time by opening the [Products](https://console.developers.nest.com/products) page and selecting your Product.
+
+## Discovery
+
+The binding will discover all Nest Things from your account when you add and configure a "Nest Account" Thing.
+See the Authorization paragraph above for details on how to obtain the Product ID, Product Secret and Pincode configuration parameters.
+
+Once the binding has successfully authorized with the Nest API, it obtains an Access Token using the Pincode.
+The configured Pincode is cleared because it can only be used once.
+The obtained Access Token is saved as an advanced configuration parameter of the "Nest Account".
+
+You can reuse an Access Token for authorization but not the Pincode.
+A new Pincode can again be generated via the "Authorization URL" (see Authorization paragraph).
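+
+For reference, the token exchange the binding performs can also be done manually. A minimal sketch, assuming the standard OAuth parameter names of the Nest API (replace the placeholders with your own Product ID, Product Secret and Pincode):
+
+```
+curl -X POST "https://api.home.nest.com/oauth2/access_token" \
+    -d "client_id=<productId>&client_secret=<productSecret>&code=<pincode>&grant_type=authorization_code"
+```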
+
+## Channels
+
+### Account Channels
+
+The account Thing Type does not have any channels.
+
+### Camera Channels
+
+**Camera group channels**
+
+Information about the camera.
+
+| Channel Type ID | Item Type | Description | Read Write |
+|-----------------------|-----------|---------------------------------------------------|:----------:|
+| app_url | String | The app URL to see the camera | R |
+| audio_input_enabled | Switch | If the audio input is currently enabled | R |
+| last_online_change | DateTime | Timestamp of the last online status change | R |
+| public_share_enabled | Switch | If public sharing is currently enabled | R |
+| public_share_url | String | The URL to see the public share of the camera | R |
+| snapshot_url | String | The URL to use for a snapshot of the video stream | R |
+| streaming | Switch | If the camera is currently streaming | R/W |
+| video_history_enabled | Switch | If the video history is currently enabled | R |
+| web_url | String | The web URL to see the camera | R |
+
+**Last event group channels**
+
+Information about the last camera event (requires Nest Aware subscription).
+
+| Channel Type ID | Item Type | Description | Read Write |
+|--------------------|-----------|------------------------------------------------------------------------------------|:----------:|
+| activity_zones | String | Identifiers for activity zones that detected the event (comma separated) | R |
+| animated_image_url | String | The URL showing an animated image for the camera event | R |
+| app_url | String | The app URL for the camera event, allows you to see the camera event in an app | R |
+| end_time | DateTime | Timestamp when the camera event ended | R |
+| has_motion | Switch | If motion was detected in the camera event | R |
+| has_person | Switch | If a person was detected in the camera event | R |
+| has_sound | Switch | If sound was detected in the camera event | R |
+| image_url | String | The URL showing an image for the camera event | R |
+| start_time | DateTime | Timestamp when the camera event started | R |
+| urls_expire_time | DateTime | Timestamp when the camera event URLs expire | R |
+| web_url | String | The web URL for the camera event, allows you to see the camera event in a web page | R |
+
+### Smoke Detector Channels
+
+| Channel Type ID | Item Type | Description | Read Write |
+|-----------------------|-----------|-----------------------------------------------------------------------------------|:----------:|
+| co_alarm_state | String | The carbon monoxide alarm state of the Nest Protect (OK, EMERGENCY, WARNING) | R |
+| last_connection | DateTime | Timestamp of the last successful interaction with Nest | R |
+| last_manual_test_time | DateTime | Timestamp of the last successful manual test | R |
+| low_battery | Switch | Reports whether the battery of the Nest protect is low (if it is battery powered) | R |
+| manual_test_active | Switch | Manual test active at the moment | R |
+| smoke_alarm_state | String | The smoke alarm state of the Nest Protect (OK, EMERGENCY, WARNING) | R |
+| ui_color_state | String | The current color of the ring on the smoke detector (GRAY, GREEN, YELLOW, RED) | R |
+
+### Structure Channels
+
+| Channel Type ID | Item Type | Description | Read Write |
+|------------------------------|-----------|--------------------------------------------------------------------------------------------------------|:----------:|
+| away | String | Away state of the structure (HOME, AWAY) | R/W |
+| country_code | String | Country code of the structure ([ISO 3166-1 alpha-2](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2)) | R |
+| co_alarm_state | String | Carbon Monoxide alarm state (OK, EMERGENCY, WARNING) | R |
+| eta_begin                    | DateTime  | Estimated time of arrival at home; used to turn on the heat in advance so the home is warm on arrival   | R          |
+| peak_period_end_time | DateTime | Peak period end for the Rush Hour Rewards program | R |
+| peak_period_start_time | DateTime | Peak period start for the Rush Hour Rewards program | R |
+| postal_code | String | Postal code of the structure | R |
+| rush_hour_rewards_enrollment | Switch    | If the Rush Hour Rewards program is enabled                                                              | R          |
+| security_state | String | Security state of the structure (OK, DETER) | R |
+| smoke_alarm_state | String | Smoke alarm state (OK, EMERGENCY, WARNING) | R |
+| time_zone | String | The time zone for the structure ([IANA time zone format](https://www.iana.org/time-zones)) | R |
+
+### Thermostat Channels
+
+| Channel Type ID | Item Type | Description | Read Write |
+|-----------------------------|----------------------|----------------------------------------------------------------------------------------|:----------:|
+| can_cool | Switch | If the thermostat can actually turn on cooling | R |
+| can_heat | Switch | If the thermostat can actually turn on heating | R |
+| eco_max_set_point | Number:Temperature | The eco range max set point temperature | R |
+| eco_min_set_point | Number:Temperature | The eco range min set point temperature | R |
+| fan_timer_active | Switch | If the fan timer is engaged | R/W |
+| fan_timer_duration | Number:Time | Length of time that the fan is set to run (15, 30, 45, 60, 120, 240, 480, 960 minutes) | R/W |
+| fan_timer_timeout | DateTime | Timestamp when the fan stops running | R |
+| has_fan | Switch | If the thermostat can control the fan | R |
+| has_leaf                    | Switch               | If the thermostat is currently displaying the energy-saving leaf                        | R          |
+| humidity | Number:Dimensionless | Indicates the current relative humidity | R |
+| last_connection | DateTime | Timestamp of the last successful interaction with Nest | R |
+| locked | Switch | If the thermostat has the temperature locked to only be within a set range | R |
+| locked_max_set_point | Number:Temperature | The locked range max set point temperature | R |
+| locked_min_set_point | Number:Temperature | The locked range min set point temperature | R |
+| max_set_point | Number:Temperature | The max set point temperature | R/W |
+| min_set_point | Number:Temperature | The min set point temperature | R/W |
+| mode | String | Current mode of the Nest thermostat (HEAT, COOL, HEAT_COOL, ECO, OFF) | R/W |
+| previous_mode | String | The previous mode of the Nest thermostat (HEAT, COOL, HEAT_COOL, ECO, OFF) | R |
+| state | String | The active state of the Nest thermostat (HEATING, COOLING, OFF) | R |
+| temperature | Number:Temperature | Current temperature | R |
+| time_to_target              | Number:Time          | Approximate time left until the target temperature is reached                           | R          |
+| set_point | Number:Temperature | The set point temperature | R/W |
+| sunlight_correction_active | Switch | If sunlight correction is active | R |
+| sunlight_correction_enabled | Switch | If sunlight correction is enabled | R |
+| using_emergency_heat | Switch | If the system is currently using emergency heat | R |
+
+Note that the Nest API rounds thermostat values, so they will differ from what shows up in the Nest App.
+The Nest API applies the following rounding:
+
+* degrees Celsius to 0.5 degrees
+* degrees Fahrenheit to whole degrees
+* humidity to 5%
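+
+For example, a thermostat temperature of 20.7 °C is reported as 20.5 °C by the API.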
+
+## Example
+
+You can use the discovery functionality of the binding to obtain the deviceId and structureId values for defining Nest things in files.
+
+Another way to get the deviceId and structureId values is by querying the Nest API yourself. First [obtain an Access Token](https://developers.nest.com/documentation/cloud/sample-code-auth) (or use the Access Token obtained by the binding).
+Then use it with one of the [API Read Examples](https://developers.nest.com/documentation/cloud/how-to-read-data).
+
+### demo.things:
+
+```
+Bridge nest:account:demo_account [ productId="8fdf9885-ca07-4252-1aa3-f3d5ca9589e0", productSecret="QITLR3iyUlWaj9dbvCxsCKp4f", accessToken="c.6rse1xtRk2UANErcY0XazaqPHgbvSSB6owOrbZrZ6IXrmqhsr9QTmcfaiLX1l0ULvlI5xLp01xmKeiojHqozLQbNM8yfITj1LSdK28zsUft1aKKH2mDlOeoqZKBdVIsxyZk4orH0AvKEZ5aY" ] {
+ camera fish_cam [ deviceId="qw0NNE8ruxA9AGJkTaFH3KeUiJaONWKiH9Gh3RwwhHClonIexTtufQ" ]
+ smoke_detector hallway_smoke [ deviceId="Tzvibaa3lLKnHpvpi9OQeCI_z5rfkBAV" ]
+ structure home [ structureId="20wKjydArmMV3kOluTA7JRcZg8HKBzTR-G_2nRXuIN1Bd6laGLOJQw" ]
+ thermostat living_thermostat [ deviceId="ZqAKzSv6TO6PjBnOCXf9LSI_z5rfkBAV" ]
+}
+```
+
+### demo.items:
+
+```
+/* Camera */
+String Cam_App_URL "App URL [%s]" { channel="nest:camera:demo_account:fish_cam:camera#app_url" }
+Switch Cam_Audio_Input_Enabled "Audio Input Enabled" { channel="nest:camera:demo_account:fish_cam:camera#audio_input_enabled" }
+DateTime Cam_Last_Online_Change "Last Online Change [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:camera:demo_account:fish_cam:camera#last_online_change" }
+String Cam_Snapshot_URL "Snapshot URL [%s]" { channel="nest:camera:demo_account:fish_cam:camera#snapshot_url" }
+Switch Cam_Streaming "Streaming" { channel="nest:camera:demo_account:fish_cam:camera#streaming" }
+Switch Cam_Public_Share_Enabled "Public Share Enabled" { channel="nest:camera:demo_account:fish_cam:camera#public_share_enabled" }
+String Cam_Public_Share_URL "Public Share URL [%s]" { channel="nest:camera:demo_account:fish_cam:camera#public_share_url" }
+Switch Cam_Video_History_Enabled "Video History Enabled" { channel="nest:camera:demo_account:fish_cam:camera#video_history_enabled" }
+String Cam_Web_URL "Web URL [%s]" { channel="nest:camera:demo_account:fish_cam:camera#web_url" }
+String Cam_LE_Activity_Zones "Last Event Activity Zones [%s]" { channel="nest:camera:demo_account:fish_cam:last_event#activity_zones" }
+String Cam_LE_Animated_Image_URL "Last Event Animated Image URL [%s]" { channel="nest:camera:demo_account:fish_cam:last_event#animated_image_url" }
+String Cam_LE_App_URL "Last Event App URL [%s]" { channel="nest:camera:demo_account:fish_cam:last_event#app_url" }
+DateTime Cam_LE_End_Time "Last Event End Time [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:camera:demo_account:fish_cam:last_event#end_time" }
+Switch Cam_LE_Has_Motion "Last Event Has Motion" { channel="nest:camera:demo_account:fish_cam:last_event#has_motion" }
+Switch Cam_LE_Has_Person "Last Event Has Person" { channel="nest:camera:demo_account:fish_cam:last_event#has_person" }
+Switch Cam_LE_Has_Sound "Last Event Has Sound" { channel="nest:camera:demo_account:fish_cam:last_event#has_sound" }
+String Cam_LE_Image_URL "Last Event Image URL [%s]" { channel="nest:camera:demo_account:fish_cam:last_event#image_url" }
+DateTime Cam_LE_Start_Time "Last Event Start Time [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:camera:demo_account:fish_cam:last_event#start_time" }
+DateTime Cam_LE_URLs_Expire_Time "Last Event URLs Expire Time [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:camera:demo_account:fish_cam:last_event#urls_expire_time" }
+String Cam_LE_Web_URL "Last Event Web URL [%s]" { channel="nest:camera:demo_account:fish_cam:last_event#web_url" }
+
+/* Smoke Detector */
+String Smoke_CO_Alarm "CO Alarm [%s]" { channel="nest:smoke_detector:demo_account:hallway_smoke:co_alarm_state" }
+Switch Smoke_Battery_Low "Battery Low" { channel="nest:smoke_detector:demo_account:hallway_smoke:low_battery" }
+Switch Smoke_Manual_Test "Manual Test" { channel="nest:smoke_detector:demo_account:hallway_smoke:manual_test_active" }
+DateTime Smoke_Last_Connection "Last Connection [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:smoke_detector:demo_account:hallway_smoke:last_connection" }
+DateTime Smoke_Last_Manual_Test "Last Manual Test [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:smoke_detector:demo_account:hallway_smoke:last_manual_test_time" }
+String Smoke_Smoke_Alarm "Smoke Alarm [%s]" { channel="nest:smoke_detector:demo_account:hallway_smoke:smoke_alarm_state" }
+String Smoke_UI_Color "UI Color [%s]" { channel="nest:smoke_detector:demo_account:hallway_smoke:ui_color_state" }
+
+/* Thermostat */
+Switch Thermostat_Can_Cool "Can Cool" { channel="nest:thermostat:demo_account:living_thermostat:can_cool" }
+Switch Thermostat_Can_Heat "Can Heat" { channel="nest:thermostat:demo_account:living_thermostat:can_heat" }
+Number:Temperature Therm_EMaxSP "Eco Max Set Point [%.1f %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:eco_max_set_point" }
+Number:Temperature Therm_EMinSP "Eco Min Set Point [%.1f %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:eco_min_set_point" }
+Switch Thermostat_FT_Active "Fan Timer Active" { channel="nest:thermostat:demo_account:living_thermostat:fan_timer_active" }
+Number:Time Thermostat_FT_Duration "Fan Timer Duration [%d %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:fan_timer_duration" }
+DateTime Thermostat_FT_Timeout "Fan Timer Timeout [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:thermostat:demo_account:living_thermostat:fan_timer_timeout" }
+Switch Thermostat_Has_Fan "Has Fan" { channel="nest:thermostat:demo_account:living_thermostat:has_fan" }
+Switch Thermostat_Has_Leaf "Has Leaf" { channel="nest:thermostat:demo_account:living_thermostat:has_leaf" }
+Number:Dimensionless Therm_Hum "Humidity [%.1f %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:humidity" }
+DateTime Thermostat_Last_Conn "Last Connection [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:thermostat:demo_account:living_thermostat:last_connection" }
+Switch Thermostat_Locked "Locked" { channel="nest:thermostat:demo_account:living_thermostat:locked" }
+Number:Temperature Therm_LMaxSP "Locked Max Set Point [%.1f %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:locked_max_set_point" }
+Number:Temperature Therm_LMinSP "Locked Min Set Point [%.1f %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:locked_min_set_point" }
+Number:Temperature Therm_Max_SP "Max Set Point [%.1f %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:max_set_point" }
+Number:Temperature Therm_Min_SP "Min Set Point [%.1f %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:min_set_point" }
+String Thermostat_Mode "Mode [%s]" { channel="nest:thermostat:demo_account:living_thermostat:mode" }
+String Thermostat_Previous_Mode "Previous Mode [%s]" { channel="nest:thermostat:demo_account:living_thermostat:previous_mode" }
+String Thermostat_State "State [%s]" { channel="nest:thermostat:demo_account:living_thermostat:state" }
+Number:Temperature Thermostat_SP "Set Point [%.1f %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:set_point" }
+Switch Thermostat_Sunlight_CA "Sunlight Correction Active" { channel="nest:thermostat:demo_account:living_thermostat:sunlight_correction_active" }
+Switch Thermostat_Sunlight_CE "Sunlight Correction Enabled" { channel="nest:thermostat:demo_account:living_thermostat:sunlight_correction_enabled" }
+Number:Temperature Therm_Temp "Temperature [%.1f %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:temperature" }
+Number:Time Therm_Time_To_Target "Time To Target [%d %unit%]" { channel="nest:thermostat:demo_account:living_thermostat:time_to_target" }
+Switch Thermostat_Using_Em_Heat "Using Emergency Heat" { channel="nest:thermostat:demo_account:living_thermostat:using_emergency_heat" }
+
+/* Structure */
+String Home_Away "Away [%s]" { channel="nest:structure:demo_account:home:away" }
+String Home_Country_Code "Country Code [%s]" { channel="nest:structure:demo_account:home:country_code" }
+String Home_CO_Alarm_State "CO Alarm State [%s]" { channel="nest:structure:demo_account:home:co_alarm_state" }
+DateTime Home_ETA "ETA [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:structure:demo_account:home:eta_begin" }
+DateTime Home_PP_End_Time "PP End Time [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:structure:demo_account:home:peak_period_end_time" }
+DateTime Home_PP_Start_Time "PP Start Time [%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS]" { channel="nest:structure:demo_account:home:peak_period_start_time" }
+String Home_Postal_Code "Postal Code [%s]" { channel="nest:structure:demo_account:home:postal_code" }
+Switch Home_Rush_Hour_Rewards "Rush Hour Rewards" { channel="nest:structure:demo_account:home:rush_hour_rewards_enrollment" }
+String Home_Security_State "Security State [%s]" { channel="nest:structure:demo_account:home:security_state" }
+String Home_Smoke_Alarm_State "Smoke Alarm State [%s]" { channel="nest:structure:demo_account:home:smoke_alarm_state" }
+String Home_Time_Zone "Time Zone [%s]" { channel="nest:structure:demo_account:home:time_zone" }
+```
+
+## Attribution
+
+This documentation contains parts written by John Cocula which were copied from the 1.0 Nest binding.
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.binding.nest</artifactId>
+
+ <name>openHAB Add-ons :: Bundles :: Nest Binding</name>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.binding.nest-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+ <repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+ <feature name="openhab-binding-nest" description="Nest Binding" version="${project.version}">
+ <feature>openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.binding.nest/${project.version}</bundle>
+ </feature>
+</features>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal;
+
+import java.time.Duration;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.thing.ThingTypeUID;
+
+/**
+ * The {@link NestBindingConstants} class defines common constants, which are
+ * used across the whole binding.
+ *
+ * @author David Bennett - Initial contribution
+ */
+@NonNullByDefault
+public class NestBindingConstants {
+
+ public static final String BINDING_ID = "nest";
+
+ /** The URL used to connect to Nest. */
+ public static final String NEST_URL = "https://developer-api.nest.com";
+
+ /** The URL to get the access token when talking to Nest. */
+ public static final String NEST_ACCESS_TOKEN_URL = "https://api.home.nest.com/oauth2/access_token";
+
+ /** The path to set values on the thermostat when talking to Nest. */
+ public static final String NEST_THERMOSTAT_UPDATE_PATH = "/devices/thermostats/";
+
+ /** The path to set values on the structure when talking to Nest. */
+ public static final String NEST_STRUCTURE_UPDATE_PATH = "/structures/";
+
+ /** The path to set values on the camera when talking to Nest. */
+ public static final String NEST_CAMERA_UPDATE_PATH = "/devices/cameras/";
+
+ /** The path to set values on the smoke detector when talking to Nest. */
+ public static final String NEST_SMOKE_ALARM_UPDATE_PATH = "/devices/smoke_co_alarms/";
+
+ /** The JSON content type used when talking to Nest. */
+ public static final String JSON_CONTENT_TYPE = "application/json";
+
+ /** To keep the streaming REST connection alive, Nest sends a message every 30 seconds. */
+ public static final long KEEP_ALIVE_MILLIS = Duration.ofSeconds(30).toMillis();
+
+ /** To avoid API throttling errors (429 Too Many Requests), Nest recommends making at most one call per minute. */
+ public static final int MIN_SECONDS_BETWEEN_API_CALLS = 60;
+
+ // List of all Thing Type UIDs
+ public static final ThingTypeUID THING_TYPE_THERMOSTAT = new ThingTypeUID(BINDING_ID, "thermostat");
+ public static final ThingTypeUID THING_TYPE_CAMERA = new ThingTypeUID(BINDING_ID, "camera");
+ public static final ThingTypeUID THING_TYPE_SMOKE_DETECTOR = new ThingTypeUID(BINDING_ID, "smoke_detector");
+ public static final ThingTypeUID THING_TYPE_BRIDGE = new ThingTypeUID(BINDING_ID, "account");
+ public static final ThingTypeUID THING_TYPE_STRUCTURE = new ThingTypeUID(BINDING_ID, "structure");
+
+ // List of all channel group prefixes
+ public static final String CHANNEL_GROUP_CAMERA_PREFIX = "camera#";
+ public static final String CHANNEL_GROUP_LAST_EVENT_PREFIX = "last_event#";
+
+ // List of all Channel IDs
+ // read only channels (common)
+ public static final String CHANNEL_LAST_CONNECTION = "last_connection";
+
+ // read/write channels (thermostat)
+ public static final String CHANNEL_MODE = "mode";
+ public static final String CHANNEL_SET_POINT = "set_point";
+ public static final String CHANNEL_MAX_SET_POINT = "max_set_point";
+ public static final String CHANNEL_MIN_SET_POINT = "min_set_point";
+ public static final String CHANNEL_FAN_TIMER_ACTIVE = "fan_timer_active";
+ public static final String CHANNEL_FAN_TIMER_DURATION = "fan_timer_duration";
+
+ // read only channels (thermostat)
+ public static final String CHANNEL_ECO_MAX_SET_POINT = "eco_max_set_point";
+ public static final String CHANNEL_ECO_MIN_SET_POINT = "eco_min_set_point";
+ public static final String CHANNEL_LOCKED = "locked";
+ public static final String CHANNEL_LOCKED_MAX_SET_POINT = "locked_max_set_point";
+ public static final String CHANNEL_LOCKED_MIN_SET_POINT = "locked_min_set_point";
+ public static final String CHANNEL_TEMPERATURE = "temperature";
+ public static final String CHANNEL_HUMIDITY = "humidity";
+ public static final String CHANNEL_PREVIOUS_MODE = "previous_mode";
+ public static final String CHANNEL_STATE = "state";
+ public static final String CHANNEL_CAN_HEAT = "can_heat";
+ public static final String CHANNEL_CAN_COOL = "can_cool";
+ public static final String CHANNEL_FAN_TIMER_TIMEOUT = "fan_timer_timeout";
+ public static final String CHANNEL_HAS_FAN = "has_fan";
+ public static final String CHANNEL_HAS_LEAF = "has_leaf";
+ public static final String CHANNEL_SUNLIGHT_CORRECTION_ENABLED = "sunlight_correction_enabled";
+ public static final String CHANNEL_SUNLIGHT_CORRECTION_ACTIVE = "sunlight_correction_active";
+ public static final String CHANNEL_TIME_TO_TARGET = "time_to_target";
+ public static final String CHANNEL_USING_EMERGENCY_HEAT = "using_emergency_heat";
+
+ // read/write channels (camera)
+ public static final String CHANNEL_CAMERA_STREAMING = "camera#streaming";
+
+ // read only channels (camera)
+ public static final String CHANNEL_CAMERA_AUDIO_INPUT_ENABLED = "camera#audio_input_enabled";
+ public static final String CHANNEL_CAMERA_VIDEO_HISTORY_ENABLED = "camera#video_history_enabled";
+ public static final String CHANNEL_CAMERA_WEB_URL = "camera#web_url";
+ public static final String CHANNEL_CAMERA_APP_URL = "camera#app_url";
+ public static final String CHANNEL_CAMERA_PUBLIC_SHARE_ENABLED = "camera#public_share_enabled";
+ public static final String CHANNEL_CAMERA_PUBLIC_SHARE_URL = "camera#public_share_url";
+ public static final String CHANNEL_CAMERA_SNAPSHOT_URL = "camera#snapshot_url";
+ public static final String CHANNEL_CAMERA_LAST_ONLINE_CHANGE = "camera#last_online_change";
+
+ public static final String CHANNEL_LAST_EVENT_HAS_SOUND = "last_event#has_sound";
+ public static final String CHANNEL_LAST_EVENT_HAS_MOTION = "last_event#has_motion";
+ public static final String CHANNEL_LAST_EVENT_HAS_PERSON = "last_event#has_person";
+ public static final String CHANNEL_LAST_EVENT_START_TIME = "last_event#start_time";
+ public static final String CHANNEL_LAST_EVENT_END_TIME = "last_event#end_time";
+ public static final String CHANNEL_LAST_EVENT_URLS_EXPIRE_TIME = "last_event#urls_expire_time";
+ public static final String CHANNEL_LAST_EVENT_WEB_URL = "last_event#web_url";
+ public static final String CHANNEL_LAST_EVENT_APP_URL = "last_event#app_url";
+ public static final String CHANNEL_LAST_EVENT_IMAGE_URL = "last_event#image_url";
+ public static final String CHANNEL_LAST_EVENT_ANIMATED_IMAGE_URL = "last_event#animated_image_url";
+ public static final String CHANNEL_LAST_EVENT_ACTIVITY_ZONES = "last_event#activity_zones";
+
+ // read/write channels (smoke detector)
+
+ // read only channels (smoke detector)
+ public static final String CHANNEL_UI_COLOR_STATE = "ui_color_state";
+ public static final String CHANNEL_LOW_BATTERY = "low_battery";
+ public static final String CHANNEL_CO_ALARM_STATE = "co_alarm_state"; // Also in structure
+ public static final String CHANNEL_SMOKE_ALARM_STATE = "smoke_alarm_state"; // Also in structure
+ public static final String CHANNEL_MANUAL_TEST_ACTIVE = "manual_test_active";
+ public static final String CHANNEL_LAST_MANUAL_TEST_TIME = "last_manual_test_time";
+
+ // read/write channel (structure)
+ public static final String CHANNEL_AWAY = "away";
+
+ // read only channels (structure)
+ public static final String CHANNEL_COUNTRY_CODE = "country_code";
+ public static final String CHANNEL_POSTAL_CODE = "postal_code";
+ public static final String CHANNEL_PEAK_PERIOD_START_TIME = "peak_period_start_time";
+ public static final String CHANNEL_PEAK_PERIOD_END_TIME = "peak_period_end_time";
+ public static final String CHANNEL_TIME_ZONE = "time_zone";
+ public static final String CHANNEL_ETA_BEGIN = "eta_begin";
+ public static final String CHANNEL_RUSH_HOUR_REWARDS_ENROLLMENT = "rush_hour_rewards_enrollment";
+ public static final String CHANNEL_SECURITY_STATE = "security_state";
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal;
+
+import static java.util.stream.Collectors.toSet;
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+
+import java.util.HashMap;
+import java.util.Hashtable;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Stream;
+
+import javax.ws.rs.client.ClientBuilder;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.binding.nest.internal.discovery.NestDiscoveryService;
+import org.openhab.binding.nest.internal.handler.NestBridgeHandler;
+import org.openhab.binding.nest.internal.handler.NestCameraHandler;
+import org.openhab.binding.nest.internal.handler.NestSmokeDetectorHandler;
+import org.openhab.binding.nest.internal.handler.NestStructureHandler;
+import org.openhab.binding.nest.internal.handler.NestThermostatHandler;
+import org.openhab.core.config.discovery.DiscoveryService;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingTypeUID;
+import org.openhab.core.thing.ThingUID;
+import org.openhab.core.thing.binding.BaseThingHandlerFactory;
+import org.openhab.core.thing.binding.ThingHandler;
+import org.openhab.core.thing.binding.ThingHandlerFactory;
+import org.osgi.framework.ServiceRegistration;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.Reference;
+import org.osgi.service.jaxrs.client.SseEventSourceFactory;
+
+/**
+ * The {@link NestHandlerFactory} is responsible for creating things and thing
+ * handlers. It also sets up the discovery service to track things from the bridge
+ * when the bridge is created.
+ *
+ * @author David Bennett - Initial contribution
+ */
+@NonNullByDefault
+@Component(service = ThingHandlerFactory.class, configurationPid = "binding.nest")
+public class NestHandlerFactory extends BaseThingHandlerFactory {
+ private static final Set<ThingTypeUID> SUPPORTED_THING_TYPES_UIDS = Stream.of(THING_TYPE_THERMOSTAT,
+ THING_TYPE_CAMERA, THING_TYPE_BRIDGE, THING_TYPE_STRUCTURE, THING_TYPE_SMOKE_DETECTOR).collect(toSet());
+
+ private final ClientBuilder clientBuilder;
+ private final SseEventSourceFactory eventSourceFactory;
+ private final Map<ThingUID, @Nullable ServiceRegistration<?>> discoveryService = new HashMap<>();
+
+ @Activate
+ public NestHandlerFactory(@Reference ClientBuilder clientBuilder,
+ @Reference SseEventSourceFactory eventSourceFactory) {
+ this.clientBuilder = clientBuilder;
+ this.eventSourceFactory = eventSourceFactory;
+ }
+
+ /**
+ * The things this factory supports creating.
+ */
+ @Override
+ public boolean supportsThingType(ThingTypeUID thingTypeUID) {
+ return SUPPORTED_THING_TYPES_UIDS.contains(thingTypeUID);
+ }
+
+ /**
+ * Creates a handler for the specific thing. This also creates the discovery service
+ * when the bridge is created.
+ */
+ @Override
+ protected @Nullable ThingHandler createHandler(Thing thing) {
+ ThingTypeUID thingTypeUID = thing.getThingTypeUID();
+
+ if (THING_TYPE_THERMOSTAT.equals(thingTypeUID)) {
+ return new NestThermostatHandler(thing);
+ }
+
+ if (THING_TYPE_CAMERA.equals(thingTypeUID)) {
+ return new NestCameraHandler(thing);
+ }
+
+ if (THING_TYPE_STRUCTURE.equals(thingTypeUID)) {
+ return new NestStructureHandler(thing);
+ }
+
+ if (THING_TYPE_SMOKE_DETECTOR.equals(thingTypeUID)) {
+ return new NestSmokeDetectorHandler(thing);
+ }
+
+ if (THING_TYPE_BRIDGE.equals(thingTypeUID)) {
+ NestBridgeHandler handler = new NestBridgeHandler((Bridge) thing, clientBuilder, eventSourceFactory);
+ NestDiscoveryService service = new NestDiscoveryService(handler);
+ service.activate();
+ // Register the discovery service.
+ discoveryService.put(handler.getThing().getUID(),
+ bundleContext.registerService(DiscoveryService.class.getName(), service, new Hashtable<>()));
+ return handler;
+ }
+
+ return null;
+ }
+
+ /**
+ * Removes the handler for the specific thing. This also handles disabling the discovery
+ * service when the bridge is removed.
+ */
+ @Override
+ protected void removeHandler(ThingHandler thingHandler) {
+ if (thingHandler instanceof NestBridgeHandler) {
+ ServiceRegistration<?> reg = discoveryService.get(thingHandler.getThing().getUID());
+ if (reg != null) {
+ // Unregister the discovery service.
+ NestDiscoveryService service = (NestDiscoveryService) bundleContext.getService(reg.getReference());
+ service.deactivate();
+ reg.unregister();
+ discoveryService.remove(thingHandler.getThing().getUID());
+ }
+ }
+ super.removeHandler(thingHandler);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal;
+
+import java.io.Reader;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+import com.google.gson.FieldNamingPolicy;
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+
+/**
+ * Utility class for sharing JSON (de)serialization methods between objects.
+ *
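+ * <p>
+ * For example, deserializing a Nest API response into one of the data classes of this binding:
+ * {@code AccessTokenData data = NestUtils.fromJson(json, AccessTokenData.class);}
+ *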
+ * @author Wouter Born - Initial contribution
+ */
+@NonNullByDefault
+public final class NestUtils {
+
+ private static final Gson GSON = new GsonBuilder().setDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'")
+ .setFieldNamingPolicy(FieldNamingPolicy.LOWER_CASE_WITH_UNDERSCORES).create();
+
+ private NestUtils() {
+ // hidden utility class constructor
+ }
+
+ public static <T> T fromJson(String json, Class<T> dataClass) {
+ return GSON.fromJson(json, dataClass);
+ }
+
+ public static <T> T fromJson(Reader reader, Class<T> dataClass) {
+ return GSON.fromJson(reader, dataClass);
+ }
+
+ public static String toJson(Object object) {
+ return GSON.toJson(object);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.config;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+
+/**
+ * The configuration for the Nest bridge, allowing it to talk to Nest.
+ *
+ * @author David Bennett - Initial contribution
+ */
+@NonNullByDefault
+public class NestBridgeConfiguration {
+ public static final String PRODUCT_ID = "productId";
+ /** Product ID from the Nest product page. */
+ public String productId = "";
+
+ public static final String PRODUCT_SECRET = "productSecret";
+ /** Product secret from the Nest product page. */
+ public String productSecret = "";
+
+ public static final String PINCODE = "pincode";
+ /** Product pincode from the Nest authorization page. */
+ public @Nullable String pincode;
+
+ public static final String ACCESS_TOKEN = "accessToken";
+ /** The access token to use once retrieved from Nest. */
+ public @Nullable String accessToken;
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.config;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * The configuration for Nest devices.
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Add device configuration to allow file based configuration
+ */
+@NonNullByDefault
+public class NestDeviceConfiguration {
+ public static final String DEVICE_ID = "deviceId";
+ /** Device ID which can be retrieved with the Nest API. */
+ public String deviceId = "";
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.config;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * The configuration for structures.
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Add device configuration to allow file based configuration
+ */
+@NonNullByDefault
+public class NestStructureConfiguration {
+ public static final String STRUCTURE_ID = "structureId";
+ /** Structure ID which can be retrieved with the Nest API. */
+ public String structureId = "";
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+/**
+ * Deals with the access token data that comes back from Nest when an access token is requested.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Add equals and hashCode methods
+ */
+public class AccessTokenData {
+
+ private String accessToken;
+ private Long expiresIn;
+
+ public String getAccessToken() {
+ return accessToken;
+ }
+
+ public Long getExpiresIn() {
+ return expiresIn;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ AccessTokenData other = (AccessTokenData) obj;
+ if (accessToken == null) {
+ if (other.accessToken != null) {
+ return false;
+ }
+ } else if (!accessToken.equals(other.accessToken)) {
+ return false;
+ }
+ if (expiresIn == null) {
+ if (other.expiresIn != null) {
+ return false;
+ }
+ } else if (!expiresIn.equals(other.expiresIn)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((accessToken == null) ? 0 : accessToken.hashCode());
+ result = prime * result + ((expiresIn == null) ? 0 : expiresIn.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("AccessTokenData [accessToken=").append(accessToken).append(", expiresIn=").append(expiresIn)
+ .append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+/**
+ * The data for a camera activity zone.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Extract ActivityZone object from Camera
+ */
+public class ActivityZone {
+
+ private String name;
+ private int id;
+
+ public String getName() {
+ return name;
+ }
+
+ public int getId() {
+ return id;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ ActivityZone other = (ActivityZone) obj;
+ if (id != other.id) {
+ return false;
+ }
+ if (name == null) {
+ if (other.name != null) {
+ return false;
+ }
+ } else if (!name.equals(other.name)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + id;
+ result = prime * result + ((name == null) ? 0 : name.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("CameraActivityZone [name=").append(name).append(", id=").append(id).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import java.util.Date;
+
+/**
+ * Default properties shared across all Nest devices.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Add equals and hashCode methods
+ */
+public class BaseNestDevice implements NestIdentifiable {
+
+ private String deviceId;
+ private String name;
+ private String nameLong;
+ private Date lastConnection;
+ private Boolean isOnline;
+ private String softwareVersion;
+ private String structureId;
+
+ private String whereId;
+
+ @Override
+ public String getId() {
+ return deviceId;
+ }
+
+ public String getName() {
+ return name;
+ }
+
+ public String getDeviceId() {
+ return deviceId;
+ }
+
+ public Date getLastConnection() {
+ return lastConnection;
+ }
+
+ public Boolean isOnline() {
+ return isOnline;
+ }
+
+ public String getNameLong() {
+ return nameLong;
+ }
+
+ public String getSoftwareVersion() {
+ return softwareVersion;
+ }
+
+ public String getStructureId() {
+ return structureId;
+ }
+
+ public String getWhereId() {
+ return whereId;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ BaseNestDevice other = (BaseNestDevice) obj;
+ if (deviceId == null) {
+ if (other.deviceId != null) {
+ return false;
+ }
+ } else if (!deviceId.equals(other.deviceId)) {
+ return false;
+ }
+ if (isOnline == null) {
+ if (other.isOnline != null) {
+ return false;
+ }
+ } else if (!isOnline.equals(other.isOnline)) {
+ return false;
+ }
+ if (lastConnection == null) {
+ if (other.lastConnection != null) {
+ return false;
+ }
+ } else if (!lastConnection.equals(other.lastConnection)) {
+ return false;
+ }
+ if (name == null) {
+ if (other.name != null) {
+ return false;
+ }
+ } else if (!name.equals(other.name)) {
+ return false;
+ }
+ if (nameLong == null) {
+ if (other.nameLong != null) {
+ return false;
+ }
+ } else if (!nameLong.equals(other.nameLong)) {
+ return false;
+ }
+ if (softwareVersion == null) {
+ if (other.softwareVersion != null) {
+ return false;
+ }
+ } else if (!softwareVersion.equals(other.softwareVersion)) {
+ return false;
+ }
+ if (structureId == null) {
+ if (other.structureId != null) {
+ return false;
+ }
+ } else if (!structureId.equals(other.structureId)) {
+ return false;
+ }
+ if (whereId == null) {
+ if (other.whereId != null) {
+ return false;
+ }
+ } else if (!whereId.equals(other.whereId)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((deviceId == null) ? 0 : deviceId.hashCode());
+ result = prime * result + ((isOnline == null) ? 0 : isOnline.hashCode());
+ result = prime * result + ((lastConnection == null) ? 0 : lastConnection.hashCode());
+ result = prime * result + ((name == null) ? 0 : name.hashCode());
+ result = prime * result + ((nameLong == null) ? 0 : nameLong.hashCode());
+ result = prime * result + ((softwareVersion == null) ? 0 : softwareVersion.hashCode());
+ result = prime * result + ((structureId == null) ? 0 : structureId.hashCode());
+ result = prime * result + ((whereId == null) ? 0 : whereId.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("BaseNestDevice [deviceId=").append(deviceId).append(", name=").append(name)
+ .append(", nameLong=").append(nameLong).append(", lastConnection=").append(lastConnection)
+ .append(", isOnline=").append(isOnline).append(", softwareVersion=").append(softwareVersion)
+ .append(", structureId=").append(structureId).append(", whereId=").append(whereId).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import java.util.Date;
+import java.util.List;
+
+/**
+ * The data for the camera.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Add equals and hashCode methods
+ */
+public class Camera extends BaseNestDevice {
+
+ private Boolean isStreaming;
+ private Boolean isAudioInputEnabled;
+ private Date lastIsOnlineChange;
+ private Boolean isVideoHistoryEnabled;
+ private String webUrl;
+ private String appUrl;
+ private Boolean isPublicShareEnabled;
+ private List<ActivityZone> activityZones;
+ private String publicShareUrl;
+ private String snapshotUrl;
+ private CameraEvent lastEvent;
+
+ public Boolean isStreaming() {
+ return isStreaming;
+ }
+
+ public Boolean isAudioInputEnabled() {
+ return isAudioInputEnabled;
+ }
+
+ public Date getLastIsOnlineChange() {
+ return lastIsOnlineChange;
+ }
+
+ public Boolean isVideoHistoryEnabled() {
+ return isVideoHistoryEnabled;
+ }
+
+ public String getWebUrl() {
+ return webUrl;
+ }
+
+ public String getAppUrl() {
+ return appUrl;
+ }
+
+ public Boolean isPublicShareEnabled() {
+ return isPublicShareEnabled;
+ }
+
+ public List<ActivityZone> getActivityZones() {
+ return activityZones;
+ }
+
+ public String getPublicShareUrl() {
+ return publicShareUrl;
+ }
+
+ public String getSnapshotUrl() {
+ return snapshotUrl;
+ }
+
+ public CameraEvent getLastEvent() {
+ return lastEvent;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (!super.equals(obj)) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ Camera other = (Camera) obj;
+ if (activityZones == null) {
+ if (other.activityZones != null) {
+ return false;
+ }
+ } else if (!activityZones.equals(other.activityZones)) {
+ return false;
+ }
+ if (appUrl == null) {
+ if (other.appUrl != null) {
+ return false;
+ }
+ } else if (!appUrl.equals(other.appUrl)) {
+ return false;
+ }
+ if (isAudioInputEnabled == null) {
+ if (other.isAudioInputEnabled != null) {
+ return false;
+ }
+ } else if (!isAudioInputEnabled.equals(other.isAudioInputEnabled)) {
+ return false;
+ }
+ if (isPublicShareEnabled == null) {
+ if (other.isPublicShareEnabled != null) {
+ return false;
+ }
+ } else if (!isPublicShareEnabled.equals(other.isPublicShareEnabled)) {
+ return false;
+ }
+ if (isStreaming == null) {
+ if (other.isStreaming != null) {
+ return false;
+ }
+ } else if (!isStreaming.equals(other.isStreaming)) {
+ return false;
+ }
+ if (isVideoHistoryEnabled == null) {
+ if (other.isVideoHistoryEnabled != null) {
+ return false;
+ }
+ } else if (!isVideoHistoryEnabled.equals(other.isVideoHistoryEnabled)) {
+ return false;
+ }
+ if (lastEvent == null) {
+ if (other.lastEvent != null) {
+ return false;
+ }
+ } else if (!lastEvent.equals(other.lastEvent)) {
+ return false;
+ }
+ if (lastIsOnlineChange == null) {
+ if (other.lastIsOnlineChange != null) {
+ return false;
+ }
+ } else if (!lastIsOnlineChange.equals(other.lastIsOnlineChange)) {
+ return false;
+ }
+ if (publicShareUrl == null) {
+ if (other.publicShareUrl != null) {
+ return false;
+ }
+ } else if (!publicShareUrl.equals(other.publicShareUrl)) {
+ return false;
+ }
+ if (snapshotUrl == null) {
+ if (other.snapshotUrl != null) {
+ return false;
+ }
+ } else if (!snapshotUrl.equals(other.snapshotUrl)) {
+ return false;
+ }
+ if (webUrl == null) {
+ if (other.webUrl != null) {
+ return false;
+ }
+ } else if (!webUrl.equals(other.webUrl)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = super.hashCode();
+ result = prime * result + ((activityZones == null) ? 0 : activityZones.hashCode());
+ result = prime * result + ((appUrl == null) ? 0 : appUrl.hashCode());
+ result = prime * result + ((isAudioInputEnabled == null) ? 0 : isAudioInputEnabled.hashCode());
+ result = prime * result + ((isPublicShareEnabled == null) ? 0 : isPublicShareEnabled.hashCode());
+ result = prime * result + ((isStreaming == null) ? 0 : isStreaming.hashCode());
+ result = prime * result + ((isVideoHistoryEnabled == null) ? 0 : isVideoHistoryEnabled.hashCode());
+ result = prime * result + ((lastEvent == null) ? 0 : lastEvent.hashCode());
+ result = prime * result + ((lastIsOnlineChange == null) ? 0 : lastIsOnlineChange.hashCode());
+ result = prime * result + ((publicShareUrl == null) ? 0 : publicShareUrl.hashCode());
+ result = prime * result + ((snapshotUrl == null) ? 0 : snapshotUrl.hashCode());
+ result = prime * result + ((webUrl == null) ? 0 : webUrl.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("Camera [isStreaming=").append(isStreaming).append(", isAudioInputEnabled=")
+ .append(isAudioInputEnabled).append(", lastIsOnlineChange=").append(lastIsOnlineChange)
+ .append(", isVideoHistoryEnabled=").append(isVideoHistoryEnabled).append(", webUrl=").append(webUrl)
+ .append(", appUrl=").append(appUrl).append(", isPublicShareEnabled=").append(isPublicShareEnabled)
+ .append(", activityZones=").append(activityZones).append(", publicShareUrl=").append(publicShareUrl)
+ .append(", snapshotUrl=").append(snapshotUrl).append(", lastEvent=").append(lastEvent)
+ .append(", getId()=").append(getId()).append(", getName()=").append(getName())
+ .append(", getDeviceId()=").append(getDeviceId()).append(", getLastConnection()=")
+ .append(getLastConnection()).append(", isOnline()=").append(isOnline()).append(", getNameLong()=")
+ .append(getNameLong()).append(", getSoftwareVersion()=").append(getSoftwareVersion())
+ .append(", getStructureId()=").append(getStructureId()).append(", getWhereId()=").append(getWhereId())
+ .append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import java.util.Date;
+import java.util.List;
+
+/**
+ * The data for a camera event.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Extract CameraEvent object from Camera
+ * @author Wouter Born - Add equals, hashCode, toString methods
+ */
+public class CameraEvent {
+
+ private Boolean hasSound;
+ private Boolean hasMotion;
+ private Boolean hasPerson;
+ private Date startTime;
+ private Date endTime;
+ private Date urlsExpireTime;
+ private String webUrl;
+ private String appUrl;
+ private String imageUrl;
+ private String animatedImageUrl;
+ private List<String> activityZoneIds;
+
+ public Boolean isHasSound() {
+ return hasSound;
+ }
+
+ public Boolean isHasMotion() {
+ return hasMotion;
+ }
+
+ public Boolean isHasPerson() {
+ return hasPerson;
+ }
+
+ public Date getStartTime() {
+ return startTime;
+ }
+
+ public Date getEndTime() {
+ return endTime;
+ }
+
+ public Date getUrlsExpireTime() {
+ return urlsExpireTime;
+ }
+
+ public String getWebUrl() {
+ return webUrl;
+ }
+
+ public String getAppUrl() {
+ return appUrl;
+ }
+
+ public String getImageUrl() {
+ return imageUrl;
+ }
+
+ public String getAnimatedImageUrl() {
+ return animatedImageUrl;
+ }
+
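+ /** Returns the ids of the activity zones related to this camera event */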
+ public List<String> getActivityZones() {
+ return activityZoneIds;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ CameraEvent other = (CameraEvent) obj;
+ if (activityZoneIds == null) {
+ if (other.activityZoneIds != null) {
+ return false;
+ }
+ } else if (!activityZoneIds.equals(other.activityZoneIds)) {
+ return false;
+ }
+ if (animatedImageUrl == null) {
+ if (other.animatedImageUrl != null) {
+ return false;
+ }
+ } else if (!animatedImageUrl.equals(other.animatedImageUrl)) {
+ return false;
+ }
+ if (appUrl == null) {
+ if (other.appUrl != null) {
+ return false;
+ }
+ } else if (!appUrl.equals(other.appUrl)) {
+ return false;
+ }
+ if (endTime == null) {
+ if (other.endTime != null) {
+ return false;
+ }
+ } else if (!endTime.equals(other.endTime)) {
+ return false;
+ }
+ if (hasMotion == null) {
+ if (other.hasMotion != null) {
+ return false;
+ }
+ } else if (!hasMotion.equals(other.hasMotion)) {
+ return false;
+ }
+ if (hasPerson == null) {
+ if (other.hasPerson != null) {
+ return false;
+ }
+ } else if (!hasPerson.equals(other.hasPerson)) {
+ return false;
+ }
+ if (hasSound == null) {
+ if (other.hasSound != null) {
+ return false;
+ }
+ } else if (!hasSound.equals(other.hasSound)) {
+ return false;
+ }
+ if (imageUrl == null) {
+ if (other.imageUrl != null) {
+ return false;
+ }
+ } else if (!imageUrl.equals(other.imageUrl)) {
+ return false;
+ }
+ if (startTime == null) {
+ if (other.startTime != null) {
+ return false;
+ }
+ } else if (!startTime.equals(other.startTime)) {
+ return false;
+ }
+ if (urlsExpireTime == null) {
+ if (other.urlsExpireTime != null) {
+ return false;
+ }
+ } else if (!urlsExpireTime.equals(other.urlsExpireTime)) {
+ return false;
+ }
+ if (webUrl == null) {
+ if (other.webUrl != null) {
+ return false;
+ }
+ } else if (!webUrl.equals(other.webUrl)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((activityZoneIds == null) ? 0 : activityZoneIds.hashCode());
+ result = prime * result + ((animatedImageUrl == null) ? 0 : animatedImageUrl.hashCode());
+ result = prime * result + ((appUrl == null) ? 0 : appUrl.hashCode());
+ result = prime * result + ((endTime == null) ? 0 : endTime.hashCode());
+ result = prime * result + ((hasMotion == null) ? 0 : hasMotion.hashCode());
+ result = prime * result + ((hasPerson == null) ? 0 : hasPerson.hashCode());
+ result = prime * result + ((hasSound == null) ? 0 : hasSound.hashCode());
+ result = prime * result + ((imageUrl == null) ? 0 : imageUrl.hashCode());
+ result = prime * result + ((startTime == null) ? 0 : startTime.hashCode());
+ result = prime * result + ((urlsExpireTime == null) ? 0 : urlsExpireTime.hashCode());
+ result = prime * result + ((webUrl == null) ? 0 : webUrl.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("Event [hasSound=").append(hasSound).append(", hasMotion=").append(hasMotion)
+ .append(", hasPerson=").append(hasPerson).append(", startTime=").append(startTime).append(", endTime=")
+ .append(endTime).append(", urlsExpireTime=").append(urlsExpireTime).append(", webUrl=").append(webUrl)
+ .append(", appUrl=").append(appUrl).append(", imageUrl=").append(imageUrl).append(", animatedImageUrl=")
+ .append(animatedImageUrl).append(", activityZoneIds=").append(activityZoneIds).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import java.util.Date;
+
+/**
+ * Used to set and update the ETA values for Nest.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Extract ETA object from Structure
+ * @author Wouter Born - Add equals, hashCode, toString methods
+ */
+public class ETA {
+
+ private String tripId;
+ private Date estimatedArrivalWindowBegin;
+ private Date estimatedArrivalWindowEnd;
+
+ public String getTripId() {
+ return tripId;
+ }
+
+ public void setTripId(String tripId) {
+ this.tripId = tripId;
+ }
+
+ public Date getEstimatedArrivalWindowBegin() {
+ return estimatedArrivalWindowBegin;
+ }
+
+ public void setEstimatedArrivalWindowBegin(Date estimatedArrivalWindowBegin) {
+ this.estimatedArrivalWindowBegin = estimatedArrivalWindowBegin;
+ }
+
+ public Date getEstimatedArrivalWindowEnd() {
+ return estimatedArrivalWindowEnd;
+ }
+
+ public void setEstimatedArrivalWindowEnd(Date estimatedArrivalWindowEnd) {
+ this.estimatedArrivalWindowEnd = estimatedArrivalWindowEnd;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ ETA other = (ETA) obj;
+ if (estimatedArrivalWindowBegin == null) {
+ if (other.estimatedArrivalWindowBegin != null) {
+ return false;
+ }
+ } else if (!estimatedArrivalWindowBegin.equals(other.estimatedArrivalWindowBegin)) {
+ return false;
+ }
+ if (estimatedArrivalWindowEnd == null) {
+ if (other.estimatedArrivalWindowEnd != null) {
+ return false;
+ }
+ } else if (!estimatedArrivalWindowEnd.equals(other.estimatedArrivalWindowEnd)) {
+ return false;
+ }
+ if (tripId == null) {
+ if (other.tripId != null) {
+ return false;
+ }
+ } else if (!tripId.equals(other.tripId)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((estimatedArrivalWindowBegin == null) ? 0 : estimatedArrivalWindowBegin.hashCode());
+ result = prime * result + ((estimatedArrivalWindowEnd == null) ? 0 : estimatedArrivalWindowEnd.hashCode());
+ result = prime * result + ((tripId == null) ? 0 : tripId.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("ETA [tripId=").append(tripId).append(", estimatedArrivalWindowBegin=")
+ .append(estimatedArrivalWindowBegin).append(", estimatedArrivalWindowEnd=")
+ .append(estimatedArrivalWindowEnd).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+/**
+ * The data for Nest API errors.
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Improve exception handling
+ * @author Wouter Born - Add equals and hashCode methods
+ */
+public class ErrorData {
+
+ private String error;
+ private String type;
+ private String message;
+ private String instance;
+
+ public String getError() {
+ return error;
+ }
+
+ public String getType() {
+ return type;
+ }
+
+ public String getMessage() {
+ return message;
+ }
+
+ public String getInstance() {
+ return instance;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ ErrorData other = (ErrorData) obj;
+ if (error == null) {
+ if (other.error != null) {
+ return false;
+ }
+ } else if (!error.equals(other.error)) {
+ return false;
+ }
+ if (instance == null) {
+ if (other.instance != null) {
+ return false;
+ }
+ } else if (!instance.equals(other.instance)) {
+ return false;
+ }
+ if (message == null) {
+ if (other.message != null) {
+ return false;
+ }
+ } else if (!message.equals(other.message)) {
+ return false;
+ }
+ if (type == null) {
+ if (other.type != null) {
+ return false;
+ }
+ } else if (!type.equals(other.type)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((error == null) ? 0 : error.hashCode());
+ result = prime * result + ((instance == null) ? 0 : instance.hashCode());
+ result = prime * result + ((message == null) ? 0 : message.hashCode());
+ result = prime * result + ((type == null) ? 0 : type.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("ErrorData [error=").append(error).append(", type=").append(type).append(", message=")
+ .append(message).append(", instance=").append(instance).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import java.util.Map;
+
+/**
+ * All the Nest devices broken up by type.
+ *
+ * @author David Bennett - Initial contribution
+ */
+public class NestDevices {
+
+ private Map<String, Thermostat> thermostats;
+ private Map<String, SmokeDetector> smokeCoAlarms;
+ private Map<String, Camera> cameras;
+
+ /** Id to thermostat mapping */
+ public Map<String, Thermostat> getThermostats() {
+ return thermostats;
+ }
+
+ /** Id to camera mapping */
+ public Map<String, Camera> getCameras() {
+ return cameras;
+ }
+
+ /** Id to smoke detector mapping */
+ public Map<String, SmokeDetector> getSmokeCoAlarms() {
+ return smokeCoAlarms;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ NestDevices other = (NestDevices) obj;
+ if (cameras == null) {
+ if (other.cameras != null) {
+ return false;
+ }
+ } else if (!cameras.equals(other.cameras)) {
+ return false;
+ }
+ if (smokeCoAlarms == null) {
+ if (other.smokeCoAlarms != null) {
+ return false;
+ }
+ } else if (!smokeCoAlarms.equals(other.smokeCoAlarms)) {
+ return false;
+ }
+ if (thermostats == null) {
+ if (other.thermostats != null) {
+ return false;
+ }
+ } else if (!thermostats.equals(other.thermostats)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((cameras == null) ? 0 : cameras.hashCode());
+ result = prime * result + ((smokeCoAlarms == null) ? 0 : smokeCoAlarms.hashCode());
+ result = prime * result + ((thermostats == null) ? 0 : thermostats.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("NestDevices [thermostats=").append(thermostats).append(", smokeCoAlarms=").append(smokeCoAlarms)
+ .append(", cameras=").append(cameras).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+/**
+ * Interface for uniquely identifiable Nest objects (a device or a structure).
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Simplify working with deviceId and structureId
+ */
+public interface NestIdentifiable {
+
+ /**
+ * Returns the identifier that uniquely identifies the Nest object (deviceId or structureId).
+ */
+ String getId();
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+/**
+ * The metadata included in the data downloads from Nest.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Add equals and hashCode methods
+ */
+public class NestMetadata {
+
+ private String accessToken;
+ private String clientVersion;
+
+ public String getAccessToken() {
+ return accessToken;
+ }
+
+ public String getClientVersion() {
+ return clientVersion;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ NestMetadata other = (NestMetadata) obj;
+ if (accessToken == null) {
+ if (other.accessToken != null) {
+ return false;
+ }
+ } else if (!accessToken.equals(other.accessToken)) {
+ return false;
+ }
+ if (clientVersion == null) {
+ if (other.clientVersion != null) {
+ return false;
+ }
+ } else if (!clientVersion.equals(other.clientVersion)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((accessToken == null) ? 0 : accessToken.hashCode());
+ result = prime * result + ((clientVersion == null) ? 0 : clientVersion.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("NestMetadata [accessToken=").append(accessToken).append(", clientVersion=")
+ .append(clientVersion).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import java.util.Date;
+
+import com.google.gson.annotations.SerializedName;
+
+/**
+ * Data for the Nest smoke detector.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Add equals and hashCode methods
+ */
+public class SmokeDetector extends BaseNestDevice {
+
+ private BatteryHealth batteryHealth;
+ private AlarmState coAlarmState;
+ private Date lastManualTestTime;
+ private AlarmState smokeAlarmState;
+ private Boolean isManualTestActive;
+ private UiColorState uiColorState;
+
+ public UiColorState getUiColorState() {
+ return uiColorState;
+ }
+
+ public BatteryHealth getBatteryHealth() {
+ return batteryHealth;
+ }
+
+ public AlarmState getCoAlarmState() {
+ return coAlarmState;
+ }
+
+ public Date getLastManualTestTime() {
+ return lastManualTestTime;
+ }
+
+ public AlarmState getSmokeAlarmState() {
+ return smokeAlarmState;
+ }
+
+ public Boolean isManualTestActive() {
+ return isManualTestActive;
+ }
+
+ public enum BatteryHealth {
+ @SerializedName("ok")
+ OK,
+ @SerializedName("replace")
+ REPLACE
+ }
+
+ public enum AlarmState {
+ @SerializedName("ok")
+ OK,
+ @SerializedName("emergency")
+ EMERGENCY,
+ @SerializedName("warning")
+ WARNING
+ }
+
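+ /** The color shown in the Nest app to indicate the overall alarm state */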
+ public enum UiColorState {
+ @SerializedName("gray")
+ GRAY,
+ @SerializedName("green")
+ GREEN,
+ @SerializedName("yellow")
+ YELLOW,
+ @SerializedName("red")
+ RED
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (!super.equals(obj)) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ SmokeDetector other = (SmokeDetector) obj;
+ if (batteryHealth != other.batteryHealth) {
+ return false;
+ }
+ if (coAlarmState != other.coAlarmState) {
+ return false;
+ }
+ if (isManualTestActive == null) {
+ if (other.isManualTestActive != null) {
+ return false;
+ }
+ } else if (!isManualTestActive.equals(other.isManualTestActive)) {
+ return false;
+ }
+ if (lastManualTestTime == null) {
+ if (other.lastManualTestTime != null) {
+ return false;
+ }
+ } else if (!lastManualTestTime.equals(other.lastManualTestTime)) {
+ return false;
+ }
+ if (smokeAlarmState != other.smokeAlarmState) {
+ return false;
+ }
+ if (uiColorState != other.uiColorState) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = super.hashCode();
+ result = prime * result + ((batteryHealth == null) ? 0 : batteryHealth.hashCode());
+ result = prime * result + ((coAlarmState == null) ? 0 : coAlarmState.hashCode());
+ result = prime * result + ((isManualTestActive == null) ? 0 : isManualTestActive.hashCode());
+ result = prime * result + ((lastManualTestTime == null) ? 0 : lastManualTestTime.hashCode());
+ result = prime * result + ((smokeAlarmState == null) ? 0 : smokeAlarmState.hashCode());
+ result = prime * result + ((uiColorState == null) ? 0 : uiColorState.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("SmokeDetector [batteryHealth=").append(batteryHealth).append(", coAlarmState=")
+ .append(coAlarmState).append(", lastManualTestTime=").append(lastManualTestTime)
+ .append(", smokeAlarmState=").append(smokeAlarmState).append(", isManualTestActive=")
+ .append(isManualTestActive).append(", uiColorState=").append(uiColorState).append(", getId()=")
+ .append(getId()).append(", getName()=").append(getName()).append(", getDeviceId()=")
+ .append(getDeviceId()).append(", getLastConnection()=").append(getLastConnection())
+ .append(", isOnline()=").append(isOnline()).append(", getNameLong()=").append(getNameLong())
+ .append(", getSoftwareVersion()=").append(getSoftwareVersion()).append(", getStructureId()=")
+ .append(getStructureId()).append(", getWhereId()=").append(getWhereId()).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import java.util.Date;
+import java.util.List;
+import java.util.Map;
+
+import org.openhab.binding.nest.internal.data.SmokeDetector.AlarmState;
+
+import com.google.gson.annotations.SerializedName;
+
+/**
+ * The structure details from Nest.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Add equals and hashCode methods
+ */
+public class Structure implements NestIdentifiable {
+
+ private String structureId;
+ private List<String> thermostats;
+ private List<String> smokeCoAlarms;
+ private List<String> cameras;
+ private String countryCode;
+ private String postalCode;
+ private Date peakPeriodStartTime;
+ private Date peakPeriodEndTime;
+ private String timeZone;
+ private Date etaBegin;
+ private AlarmState coAlarmState;
+ private AlarmState smokeAlarmState;
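+ /** Whether the structure is enrolled in Rush Hour Rewards (rhr) */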
+ private Boolean rhrEnrollment;
+ private Map<String, Where> wheres;
+ private HomeAwayState away;
+ private String name;
+ private ETA eta;
+ private SecurityState wwnSecurityState;
+
+ @Override
+ public String getId() {
+ return structureId;
+ }
+
+ public HomeAwayState getAway() {
+ return away;
+ }
+
+ public void setAway(HomeAwayState away) {
+ this.away = away;
+ }
+
+ public String getStructureId() {
+ return structureId;
+ }
+
+ public List<String> getThermostats() {
+ return thermostats;
+ }
+
+ public List<String> getSmokeCoAlarms() {
+ return smokeCoAlarms;
+ }
+
+ public List<String> getCameras() {
+ return cameras;
+ }
+
+ public String getCountryCode() {
+ return countryCode;
+ }
+
+ public String getPostalCode() {
+ return postalCode;
+ }
+
+ public Date getPeakPeriodStartTime() {
+ return peakPeriodStartTime;
+ }
+
+ public Date getPeakPeriodEndTime() {
+ return peakPeriodEndTime;
+ }
+
+ public String getTimeZone() {
+ return timeZone;
+ }
+
+ public Date getEtaBegin() {
+ return etaBegin;
+ }
+
+ public AlarmState getCoAlarmState() {
+ return coAlarmState;
+ }
+
+ public AlarmState getSmokeAlarmState() {
+ return smokeAlarmState;
+ }
+
+ public Boolean isRhrEnrollment() {
+ return rhrEnrollment;
+ }
+
+ public Map<String, Where> getWheres() {
+ return wheres;
+ }
+
+ public ETA getEta() {
+ return eta;
+ }
+
+ public String getName() {
+ return name;
+ }
+
+ public SecurityState getWwnSecurityState() {
+ return wwnSecurityState;
+ }
+
+ public enum HomeAwayState {
+ @SerializedName("home")
+ HOME,
+ @SerializedName("away")
+ AWAY,
+ @SerializedName("unknown")
+ UNKNOWN
+ }
+
+ public enum SecurityState {
+ @SerializedName("ok")
+ OK,
+ @SerializedName("deter")
+ DETER
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ Structure other = (Structure) obj;
+ if (away != other.away) {
+ return false;
+ }
+ if (cameras == null) {
+ if (other.cameras != null) {
+ return false;
+ }
+ } else if (!cameras.equals(other.cameras)) {
+ return false;
+ }
+ if (coAlarmState != other.coAlarmState) {
+ return false;
+ }
+ if (countryCode == null) {
+ if (other.countryCode != null) {
+ return false;
+ }
+ } else if (!countryCode.equals(other.countryCode)) {
+ return false;
+ }
+ if (eta == null) {
+ if (other.eta != null) {
+ return false;
+ }
+ } else if (!eta.equals(other.eta)) {
+ return false;
+ }
+ if (etaBegin == null) {
+ if (other.etaBegin != null) {
+ return false;
+ }
+ } else if (!etaBegin.equals(other.etaBegin)) {
+ return false;
+ }
+ if (name == null) {
+ if (other.name != null) {
+ return false;
+ }
+ } else if (!name.equals(other.name)) {
+ return false;
+ }
+ if (peakPeriodEndTime == null) {
+ if (other.peakPeriodEndTime != null) {
+ return false;
+ }
+ } else if (!peakPeriodEndTime.equals(other.peakPeriodEndTime)) {
+ return false;
+ }
+ if (peakPeriodStartTime == null) {
+ if (other.peakPeriodStartTime != null) {
+ return false;
+ }
+ } else if (!peakPeriodStartTime.equals(other.peakPeriodStartTime)) {
+ return false;
+ }
+ if (postalCode == null) {
+ if (other.postalCode != null) {
+ return false;
+ }
+ } else if (!postalCode.equals(other.postalCode)) {
+ return false;
+ }
+ if (rhrEnrollment == null) {
+ if (other.rhrEnrollment != null) {
+ return false;
+ }
+ } else if (!rhrEnrollment.equals(other.rhrEnrollment)) {
+ return false;
+ }
+ if (smokeAlarmState != other.smokeAlarmState) {
+ return false;
+ }
+ if (smokeCoAlarms == null) {
+ if (other.smokeCoAlarms != null) {
+ return false;
+ }
+ } else if (!smokeCoAlarms.equals(other.smokeCoAlarms)) {
+ return false;
+ }
+ if (structureId == null) {
+ if (other.structureId != null) {
+ return false;
+ }
+ } else if (!structureId.equals(other.structureId)) {
+ return false;
+ }
+ if (thermostats == null) {
+ if (other.thermostats != null) {
+ return false;
+ }
+ } else if (!thermostats.equals(other.thermostats)) {
+ return false;
+ }
+ if (timeZone == null) {
+ if (other.timeZone != null) {
+ return false;
+ }
+ } else if (!timeZone.equals(other.timeZone)) {
+ return false;
+ }
+ if (wheres == null) {
+ if (other.wheres != null) {
+ return false;
+ }
+ } else if (!wheres.equals(other.wheres)) {
+ return false;
+ }
+ if (wwnSecurityState != other.wwnSecurityState) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((away == null) ? 0 : away.hashCode());
+ result = prime * result + ((cameras == null) ? 0 : cameras.hashCode());
+ result = prime * result + ((coAlarmState == null) ? 0 : coAlarmState.hashCode());
+ result = prime * result + ((countryCode == null) ? 0 : countryCode.hashCode());
+ result = prime * result + ((eta == null) ? 0 : eta.hashCode());
+ result = prime * result + ((etaBegin == null) ? 0 : etaBegin.hashCode());
+ result = prime * result + ((name == null) ? 0 : name.hashCode());
+ result = prime * result + ((peakPeriodEndTime == null) ? 0 : peakPeriodEndTime.hashCode());
+ result = prime * result + ((peakPeriodStartTime == null) ? 0 : peakPeriodStartTime.hashCode());
+ result = prime * result + ((postalCode == null) ? 0 : postalCode.hashCode());
+ result = prime * result + ((rhrEnrollment == null) ? 0 : rhrEnrollment.hashCode());
+ result = prime * result + ((smokeAlarmState == null) ? 0 : smokeAlarmState.hashCode());
+ result = prime * result + ((smokeCoAlarms == null) ? 0 : smokeCoAlarms.hashCode());
+ result = prime * result + ((structureId == null) ? 0 : structureId.hashCode());
+ result = prime * result + ((thermostats == null) ? 0 : thermostats.hashCode());
+ result = prime * result + ((timeZone == null) ? 0 : timeZone.hashCode());
+ result = prime * result + ((wheres == null) ? 0 : wheres.hashCode());
+ result = prime * result + ((wwnSecurityState == null) ? 0 : wwnSecurityState.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("Structure [structureId=").append(structureId).append(", thermostats=").append(thermostats)
+ .append(", smokeCoAlarms=").append(smokeCoAlarms).append(", cameras=").append(cameras)
+ .append(", countryCode=").append(countryCode).append(", postalCode=").append(postalCode)
+ .append(", peakPeriodStartTime=").append(peakPeriodStartTime).append(", peakPeriodEndTime=")
+ .append(peakPeriodEndTime).append(", timeZone=").append(timeZone).append(", etaBegin=").append(etaBegin)
+ .append(", coAlarmState=").append(coAlarmState).append(", smokeAlarmState=").append(smokeAlarmState)
+ .append(", rhrEnrollment=").append(rhrEnrollment).append(", wheres=").append(wheres).append(", away=")
+ .append(away).append(", name=").append(name).append(", eta=").append(eta).append(", wwnSecurityState=")
+ .append(wwnSecurityState).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import static org.openhab.core.library.unit.ImperialUnits.FAHRENHEIT;
+import static org.openhab.core.library.unit.SIUnits.CELSIUS;
+
+import java.util.Date;
+
+import javax.measure.Unit;
+import javax.measure.quantity.Temperature;
+
+import com.google.gson.annotations.SerializedName;
+
+/**
+ * Gson class to encapsulate the data for the Nest thermostat.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Add equals and hashCode methods
+ */
+public class Thermostat extends BaseNestDevice {
+
+ private Boolean canCool;
+ private Boolean canHeat;
+ private Boolean isUsingEmergencyHeat;
+ private Boolean hasFan;
+ private Boolean fanTimerActive;
+ private Date fanTimerTimeout;
+ private Boolean hasLeaf;
+ private String temperatureScale;
+ private Double ambientTemperatureC;
+ private Double ambientTemperatureF;
+ private Integer humidity;
+ private Double targetTemperatureC;
+ private Double targetTemperatureF;
+ private Double targetTemperatureHighC;
+ private Double targetTemperatureHighF;
+ private Double targetTemperatureLowC;
+ private Double targetTemperatureLowF;
+ private Mode hvacMode;
+ private Mode previousHvacMode;
+ private State hvacState;
+ private Double ecoTemperatureHighC;
+ private Double ecoTemperatureHighF;
+ private Double ecoTemperatureLowC;
+ private Double ecoTemperatureLowF;
+ private Boolean isLocked;
+ private Double lockedTempMaxC;
+ private Double lockedTempMaxF;
+ private Double lockedTempMinC;
+ private Double lockedTempMinF;
+ private Boolean sunlightCorrectionEnabled;
+ private Boolean sunlightCorrectionActive;
+ private Integer fanTimerDuration;
+ private String timeToTarget;
+ private String whereName;
+
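+ /**
+ * Returns the unit matching the thermostat's temperature scale ("C" or "F"),
+ * or null when the scale is unknown.
+ */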
+ public Unit<Temperature> getTemperatureUnit() {
+ if ("C".equals(temperatureScale)) {
+ return CELSIUS;
+ } else if ("F".equals(temperatureScale)) {
+ return FAHRENHEIT;
+ } else {
+ return null;
+ }
+ }
+
+ public Double getTargetTemperature() {
+ if (getTemperatureUnit() == CELSIUS) {
+ return targetTemperatureC;
+ } else if (getTemperatureUnit() == FAHRENHEIT) {
+ return targetTemperatureF;
+ } else {
+ return null;
+ }
+ }
+
+ public Double getTargetTemperatureHigh() {
+ if (getTemperatureUnit() == CELSIUS) {
+ return targetTemperatureHighC;
+ } else if (getTemperatureUnit() == FAHRENHEIT) {
+ return targetTemperatureHighF;
+ } else {
+ return null;
+ }
+ }
+
+ public Double getTargetTemperatureLow() {
+ if (getTemperatureUnit() == CELSIUS) {
+ return targetTemperatureLowC;
+ } else if (getTemperatureUnit() == FAHRENHEIT) {
+ return targetTemperatureLowF;
+ } else {
+ return null;
+ }
+ }
+
+ public Mode getMode() {
+ return hvacMode;
+ }
+
+ public Double getEcoTemperatureHigh() {
+ if (getTemperatureUnit() == CELSIUS) {
+ return ecoTemperatureHighC;
+ } else if (getTemperatureUnit() == FAHRENHEIT) {
+ return ecoTemperatureHighF;
+ } else {
+ return null;
+ }
+ }
+
+ public Double getEcoTemperatureLow() {
+ if (getTemperatureUnit() == CELSIUS) {
+ return ecoTemperatureLowC;
+ } else if (getTemperatureUnit() == FAHRENHEIT) {
+ return ecoTemperatureLowF;
+ } else {
+ return null;
+ }
+ }
+
+ public Boolean isLocked() {
+ return isLocked;
+ }
+
+ public Double getLockedTempMax() {
+ if (getTemperatureUnit() == CELSIUS) {
+ return lockedTempMaxC;
+ } else if (getTemperatureUnit() == FAHRENHEIT) {
+ return lockedTempMaxF;
+ } else {
+ return null;
+ }
+ }
+
+ public Double getLockedTempMin() {
+ if (getTemperatureUnit() == CELSIUS) {
+ return lockedTempMinC;
+ } else if (getTemperatureUnit() == FAHRENHEIT) {
+ return lockedTempMinF;
+ } else {
+ return null;
+ }
+ }
+
+ public Boolean isCanCool() {
+ return canCool;
+ }
+
+ public Boolean isCanHeat() {
+ return canHeat;
+ }
+
+ public Boolean isUsingEmergencyHeat() {
+ return isUsingEmergencyHeat;
+ }
+
+ public Boolean isHasFan() {
+ return hasFan;
+ }
+
+ public Boolean isFanTimerActive() {
+ return fanTimerActive;
+ }
+
+ public Date getFanTimerTimeout() {
+ return fanTimerTimeout;
+ }
+
+ public Boolean isHasLeaf() {
+ return hasLeaf;
+ }
+
+ public Mode getPreviousHvacMode() {
+ return previousHvacMode;
+ }
+
+ public State getHvacState() {
+ return hvacState;
+ }
+
+ public Boolean isSunlightCorrectionEnabled() {
+ return sunlightCorrectionEnabled;
+ }
+
+ public Boolean isSunlightCorrectionActive() {
+ return sunlightCorrectionActive;
+ }
+
+ public Integer getFanTimerDuration() {
+ return fanTimerDuration;
+ }
+
+ public Integer getTimeToTarget() {
+ return parseTimeToTarget(timeToTarget);
+ }
+
+ /*
+ * Parses the time to target string into a number, stripping the leading "~", "<" or ">"
+ * character (e.g. "~15") when present.
+ */
+ static Integer parseTimeToTarget(String timeToTarget) {
+ if (timeToTarget == null) {
+ return null;
+ } else if (timeToTarget.startsWith("~") || timeToTarget.startsWith("<") || timeToTarget.startsWith(">")) {
+ return Integer.valueOf(timeToTarget.substring(1));
+ }
+ return Integer.valueOf(timeToTarget);
+ }
+
+ public String getWhereName() {
+ return whereName;
+ }
+
+ public Double getAmbientTemperature() {
+ if (getTemperatureUnit() == CELSIUS) {
+ return ambientTemperatureC;
+ } else if (getTemperatureUnit() == FAHRENHEIT) {
+ return ambientTemperatureF;
+ } else {
+ return null;
+ }
+ }
+
+ public Integer getHumidity() {
+ return humidity;
+ }
+
+ public enum Mode {
+ @SerializedName("heat")
+ HEAT,
+ @SerializedName("cool")
+ COOL,
+ @SerializedName("heat-cool")
+ HEAT_COOL,
+ @SerializedName("eco")
+ ECO,
+ @SerializedName("off")
+ OFF
+ }
+
+ public enum State {
+ @SerializedName("heating")
+ HEATING,
+ @SerializedName("cooling")
+ COOLING,
+ @SerializedName("off")
+ OFF
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (!super.equals(obj)) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ Thermostat other = (Thermostat) obj;
+ if (ambientTemperatureC == null) {
+ if (other.ambientTemperatureC != null) {
+ return false;
+ }
+ } else if (!ambientTemperatureC.equals(other.ambientTemperatureC)) {
+ return false;
+ }
+ if (ambientTemperatureF == null) {
+ if (other.ambientTemperatureF != null) {
+ return false;
+ }
+ } else if (!ambientTemperatureF.equals(other.ambientTemperatureF)) {
+ return false;
+ }
+ if (canCool == null) {
+ if (other.canCool != null) {
+ return false;
+ }
+ } else if (!canCool.equals(other.canCool)) {
+ return false;
+ }
+ if (canHeat == null) {
+ if (other.canHeat != null) {
+ return false;
+ }
+ } else if (!canHeat.equals(other.canHeat)) {
+ return false;
+ }
+ if (ecoTemperatureHighC == null) {
+ if (other.ecoTemperatureHighC != null) {
+ return false;
+ }
+ } else if (!ecoTemperatureHighC.equals(other.ecoTemperatureHighC)) {
+ return false;
+ }
+ if (ecoTemperatureHighF == null) {
+ if (other.ecoTemperatureHighF != null) {
+ return false;
+ }
+ } else if (!ecoTemperatureHighF.equals(other.ecoTemperatureHighF)) {
+ return false;
+ }
+ if (ecoTemperatureLowC == null) {
+ if (other.ecoTemperatureLowC != null) {
+ return false;
+ }
+ } else if (!ecoTemperatureLowC.equals(other.ecoTemperatureLowC)) {
+ return false;
+ }
+ if (ecoTemperatureLowF == null) {
+ if (other.ecoTemperatureLowF != null) {
+ return false;
+ }
+ } else if (!ecoTemperatureLowF.equals(other.ecoTemperatureLowF)) {
+ return false;
+ }
+ if (fanTimerActive == null) {
+ if (other.fanTimerActive != null) {
+ return false;
+ }
+ } else if (!fanTimerActive.equals(other.fanTimerActive)) {
+ return false;
+ }
+ if (fanTimerDuration == null) {
+ if (other.fanTimerDuration != null) {
+ return false;
+ }
+ } else if (!fanTimerDuration.equals(other.fanTimerDuration)) {
+ return false;
+ }
+ if (fanTimerTimeout == null) {
+ if (other.fanTimerTimeout != null) {
+ return false;
+ }
+ } else if (!fanTimerTimeout.equals(other.fanTimerTimeout)) {
+ return false;
+ }
+ if (hasFan == null) {
+ if (other.hasFan != null) {
+ return false;
+ }
+ } else if (!hasFan.equals(other.hasFan)) {
+ return false;
+ }
+ if (hasLeaf == null) {
+ if (other.hasLeaf != null) {
+ return false;
+ }
+ } else if (!hasLeaf.equals(other.hasLeaf)) {
+ return false;
+ }
+ if (humidity == null) {
+ if (other.humidity != null) {
+ return false;
+ }
+ } else if (!humidity.equals(other.humidity)) {
+ return false;
+ }
+ if (hvacMode != other.hvacMode) {
+ return false;
+ }
+ if (hvacState != other.hvacState) {
+ return false;
+ }
+ if (isLocked == null) {
+ if (other.isLocked != null) {
+ return false;
+ }
+ } else if (!isLocked.equals(other.isLocked)) {
+ return false;
+ }
+ if (isUsingEmergencyHeat == null) {
+ if (other.isUsingEmergencyHeat != null) {
+ return false;
+ }
+ } else if (!isUsingEmergencyHeat.equals(other.isUsingEmergencyHeat)) {
+ return false;
+ }
+ if (lockedTempMaxC == null) {
+ if (other.lockedTempMaxC != null) {
+ return false;
+ }
+ } else if (!lockedTempMaxC.equals(other.lockedTempMaxC)) {
+ return false;
+ }
+ if (lockedTempMaxF == null) {
+ if (other.lockedTempMaxF != null) {
+ return false;
+ }
+ } else if (!lockedTempMaxF.equals(other.lockedTempMaxF)) {
+ return false;
+ }
+ if (lockedTempMinC == null) {
+ if (other.lockedTempMinC != null) {
+ return false;
+ }
+ } else if (!lockedTempMinC.equals(other.lockedTempMinC)) {
+ return false;
+ }
+ if (lockedTempMinF == null) {
+ if (other.lockedTempMinF != null) {
+ return false;
+ }
+ } else if (!lockedTempMinF.equals(other.lockedTempMinF)) {
+ return false;
+ }
+ if (previousHvacMode != other.previousHvacMode) {
+ return false;
+ }
+ if (sunlightCorrectionActive == null) {
+ if (other.sunlightCorrectionActive != null) {
+ return false;
+ }
+ } else if (!sunlightCorrectionActive.equals(other.sunlightCorrectionActive)) {
+ return false;
+ }
+ if (sunlightCorrectionEnabled == null) {
+ if (other.sunlightCorrectionEnabled != null) {
+ return false;
+ }
+ } else if (!sunlightCorrectionEnabled.equals(other.sunlightCorrectionEnabled)) {
+ return false;
+ }
+ if (targetTemperatureC == null) {
+ if (other.targetTemperatureC != null) {
+ return false;
+ }
+ } else if (!targetTemperatureC.equals(other.targetTemperatureC)) {
+ return false;
+ }
+ if (targetTemperatureF == null) {
+ if (other.targetTemperatureF != null) {
+ return false;
+ }
+ } else if (!targetTemperatureF.equals(other.targetTemperatureF)) {
+ return false;
+ }
+ if (targetTemperatureHighC == null) {
+ if (other.targetTemperatureHighC != null) {
+ return false;
+ }
+ } else if (!targetTemperatureHighC.equals(other.targetTemperatureHighC)) {
+ return false;
+ }
+ if (targetTemperatureHighF == null) {
+ if (other.targetTemperatureHighF != null) {
+ return false;
+ }
+ } else if (!targetTemperatureHighF.equals(other.targetTemperatureHighF)) {
+ return false;
+ }
+ if (targetTemperatureLowC == null) {
+ if (other.targetTemperatureLowC != null) {
+ return false;
+ }
+ } else if (!targetTemperatureLowC.equals(other.targetTemperatureLowC)) {
+ return false;
+ }
+ if (targetTemperatureLowF == null) {
+ if (other.targetTemperatureLowF != null) {
+ return false;
+ }
+ } else if (!targetTemperatureLowF.equals(other.targetTemperatureLowF)) {
+ return false;
+ }
+ if (temperatureScale == null) {
+ if (other.temperatureScale != null) {
+ return false;
+ }
+ } else if (!temperatureScale.equals(other.temperatureScale)) {
+ return false;
+ }
+ if (timeToTarget == null) {
+ if (other.timeToTarget != null) {
+ return false;
+ }
+ } else if (!timeToTarget.equals(other.timeToTarget)) {
+ return false;
+ }
+ if (whereName == null) {
+ if (other.whereName != null) {
+ return false;
+ }
+ } else if (!whereName.equals(other.whereName)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = super.hashCode();
+ result = prime * result + ((ambientTemperatureC == null) ? 0 : ambientTemperatureC.hashCode());
+ result = prime * result + ((ambientTemperatureF == null) ? 0 : ambientTemperatureF.hashCode());
+ result = prime * result + ((canCool == null) ? 0 : canCool.hashCode());
+ result = prime * result + ((canHeat == null) ? 0 : canHeat.hashCode());
+ result = prime * result + ((ecoTemperatureHighC == null) ? 0 : ecoTemperatureHighC.hashCode());
+ result = prime * result + ((ecoTemperatureHighF == null) ? 0 : ecoTemperatureHighF.hashCode());
+ result = prime * result + ((ecoTemperatureLowC == null) ? 0 : ecoTemperatureLowC.hashCode());
+ result = prime * result + ((ecoTemperatureLowF == null) ? 0 : ecoTemperatureLowF.hashCode());
+ result = prime * result + ((fanTimerActive == null) ? 0 : fanTimerActive.hashCode());
+ result = prime * result + ((fanTimerDuration == null) ? 0 : fanTimerDuration.hashCode());
+ result = prime * result + ((fanTimerTimeout == null) ? 0 : fanTimerTimeout.hashCode());
+ result = prime * result + ((hasFan == null) ? 0 : hasFan.hashCode());
+ result = prime * result + ((hasLeaf == null) ? 0 : hasLeaf.hashCode());
+ result = prime * result + ((humidity == null) ? 0 : humidity.hashCode());
+ result = prime * result + ((hvacMode == null) ? 0 : hvacMode.hashCode());
+ result = prime * result + ((hvacState == null) ? 0 : hvacState.hashCode());
+ result = prime * result + ((isLocked == null) ? 0 : isLocked.hashCode());
+ result = prime * result + ((isUsingEmergencyHeat == null) ? 0 : isUsingEmergencyHeat.hashCode());
+ result = prime * result + ((lockedTempMaxC == null) ? 0 : lockedTempMaxC.hashCode());
+ result = prime * result + ((lockedTempMaxF == null) ? 0 : lockedTempMaxF.hashCode());
+ result = prime * result + ((lockedTempMinC == null) ? 0 : lockedTempMinC.hashCode());
+ result = prime * result + ((lockedTempMinF == null) ? 0 : lockedTempMinF.hashCode());
+ result = prime * result + ((previousHvacMode == null) ? 0 : previousHvacMode.hashCode());
+ result = prime * result + ((sunlightCorrectionActive == null) ? 0 : sunlightCorrectionActive.hashCode());
+ result = prime * result + ((sunlightCorrectionEnabled == null) ? 0 : sunlightCorrectionEnabled.hashCode());
+ result = prime * result + ((targetTemperatureC == null) ? 0 : targetTemperatureC.hashCode());
+ result = prime * result + ((targetTemperatureF == null) ? 0 : targetTemperatureF.hashCode());
+ result = prime * result + ((targetTemperatureHighC == null) ? 0 : targetTemperatureHighC.hashCode());
+ result = prime * result + ((targetTemperatureHighF == null) ? 0 : targetTemperatureHighF.hashCode());
+ result = prime * result + ((targetTemperatureLowC == null) ? 0 : targetTemperatureLowC.hashCode());
+ result = prime * result + ((targetTemperatureLowF == null) ? 0 : targetTemperatureLowF.hashCode());
+ result = prime * result + ((temperatureScale == null) ? 0 : temperatureScale.hashCode());
+ result = prime * result + ((timeToTarget == null) ? 0 : timeToTarget.hashCode());
+ result = prime * result + ((whereName == null) ? 0 : whereName.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("Thermostat [canCool=").append(canCool).append(", canHeat=").append(canHeat)
+ .append(", isUsingEmergencyHeat=").append(isUsingEmergencyHeat).append(", hasFan=").append(hasFan)
+ .append(", fanTimerActive=").append(fanTimerActive).append(", fanTimerTimeout=").append(fanTimerTimeout)
+ .append(", hasLeaf=").append(hasLeaf).append(", temperatureScale=").append(temperatureScale)
+ .append(", ambientTemperatureC=").append(ambientTemperatureC).append(", ambientTemperatureF=")
+ .append(ambientTemperatureF).append(", humidity=").append(humidity).append(", targetTemperatureC=")
+ .append(targetTemperatureC).append(", targetTemperatureF=").append(targetTemperatureF)
+ .append(", targetTemperatureHighC=").append(targetTemperatureHighC).append(", targetTemperatureHighF=")
+ .append(targetTemperatureHighF).append(", targetTemperatureLowC=").append(targetTemperatureLowC)
+ .append(", targetTemperatureLowF=").append(targetTemperatureLowF).append(", hvacMode=").append(hvacMode)
+ .append(", previousHvacMode=").append(previousHvacMode).append(", hvacState=").append(hvacState)
+ .append(", ecoTemperatureHighC=").append(ecoTemperatureHighC).append(", ecoTemperatureHighF=")
+ .append(ecoTemperatureHighF).append(", ecoTemperatureLowC=").append(ecoTemperatureLowC)
+ .append(", ecoTemperatureLowF=").append(ecoTemperatureLowF).append(", isLocked=").append(isLocked)
+ .append(", lockedTempMaxC=").append(lockedTempMaxC).append(", lockedTempMaxF=").append(lockedTempMaxF)
+ .append(", lockedTempMinC=").append(lockedTempMinC).append(", lockedTempMinF=").append(lockedTempMinF)
+ .append(", sunlightCorrectionEnabled=").append(sunlightCorrectionEnabled)
+ .append(", sunlightCorrectionActive=").append(sunlightCorrectionActive).append(", fanTimerDuration=")
+ .append(fanTimerDuration).append(", timeToTarget=").append(timeToTarget).append(", whereName=")
+ .append(whereName).append(", getId()=").append(getId()).append(", getName()=").append(getName())
+ .append(", getDeviceId()=").append(getDeviceId()).append(", getLastConnection()=")
+ .append(getLastConnection()).append(", isOnline()=").append(isOnline()).append(", getNameLong()=")
+ .append(getNameLong()).append(", getSoftwareVersion()=").append(getSoftwareVersion())
+ .append(", getStructureId()=").append(getStructureId()).append(", getWhereId()=").append(getWhereId())
+ .append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import java.util.Map;
+
+/**
+ * The top level data for the Nest API; this is the format in which Nest returns its data.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Add equals and hashCode methods
+ */
+public class TopLevelData {
+
+ private NestDevices devices;
+ private NestMetadata metadata;
+ private Map<String, Structure> structures;
+
+ public NestDevices getDevices() {
+ return devices;
+ }
+
+ public NestMetadata getMetadata() {
+ return metadata;
+ }
+
+ public Map<String, Structure> getStructures() {
+ return structures;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ TopLevelData other = (TopLevelData) obj;
+ if (devices == null) {
+ if (other.devices != null) {
+ return false;
+ }
+ } else if (!devices.equals(other.devices)) {
+ return false;
+ }
+ if (metadata == null) {
+ if (other.metadata != null) {
+ return false;
+ }
+ } else if (!metadata.equals(other.metadata)) {
+ return false;
+ }
+ if (structures == null) {
+ if (other.structures != null) {
+ return false;
+ }
+ } else if (!structures.equals(other.structures)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((devices == null) ? 0 : devices.hashCode());
+ result = prime * result + ((metadata == null) ? 0 : metadata.hashCode());
+ result = prime * result + ((structures == null) ? 0 : structures.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("TopLevelData [devices=").append(devices).append(", metadata=").append(metadata)
+ .append(", structures=").append(structures).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+/**
+ * The top level data that is sent by Nest to a streaming REST client using SSE.
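+ *
+ * <p>A sketch of deserializing a streaming event payload (null checks omitted; {@code NestUtils} is the
+ * binding's GSON helper):
+ *
+ * <pre>{@code
+ * TopLevelStreamingData streamingData = NestUtils.fromJson(eventData, TopLevelStreamingData.class);
+ * TopLevelData data = streamingData.getData();
+ * }</pre>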
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Replace polling with REST streaming
+ * @author Wouter Born - Add equals and hashCode methods
+ */
+public class TopLevelStreamingData {
+
+ private String path;
+ private TopLevelData data;
+
+ public String getPath() {
+ return path;
+ }
+
+ public TopLevelData getData() {
+ return data;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((data == null) ? 0 : data.hashCode());
+ result = prime * result + ((path == null) ? 0 : path.hashCode());
+ return result;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ TopLevelStreamingData other = (TopLevelStreamingData) obj;
+ if (data == null) {
+ if (other.data != null) {
+ return false;
+ }
+ } else if (!data.equals(other.data)) {
+ return false;
+ }
+ if (path == null) {
+ if (other.path != null) {
+ return false;
+ }
+ } else if (!path.equals(other.path)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("TopLevelStreamingData [path=").append(path).append(", data=").append(data).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+/**
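+ * A Nest where: a named location within a structure (e.g. "Living Room") to which a device can be assigned.
+ *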
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Extract Where object from Structure
+ * @author Wouter Born - Add equals, hashCode, toString methods
+ */
+public class Where {
+ private String whereId;
+ private String name;
+
+ public String getWhereId() {
+ return whereId;
+ }
+
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ Where other = (Where) obj;
+ if (name == null) {
+ if (other.name != null) {
+ return false;
+ }
+ } else if (!name.equals(other.name)) {
+ return false;
+ }
+ if (whereId == null) {
+ if (other.whereId != null) {
+ return false;
+ }
+ } else if (!whereId.equals(other.whereId)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((name == null) ? 0 : name.hashCode());
+ result = prime * result + ((whereId == null) ? 0 : whereId.hashCode());
+ return result;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("Where [whereId=").append(whereId).append(", name=").append(name).append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.discovery;
+
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+import static org.openhab.core.thing.Thing.PROPERTY_FIRMWARE_VERSION;
+
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.function.BiConsumer;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.binding.nest.internal.config.NestDeviceConfiguration;
+import org.openhab.binding.nest.internal.config.NestStructureConfiguration;
+import org.openhab.binding.nest.internal.data.BaseNestDevice;
+import org.openhab.binding.nest.internal.data.Camera;
+import org.openhab.binding.nest.internal.data.SmokeDetector;
+import org.openhab.binding.nest.internal.data.Structure;
+import org.openhab.binding.nest.internal.data.Thermostat;
+import org.openhab.binding.nest.internal.handler.NestBridgeHandler;
+import org.openhab.binding.nest.internal.listener.NestThingDataListener;
+import org.openhab.core.config.discovery.AbstractDiscoveryService;
+import org.openhab.core.config.discovery.DiscoveryResultBuilder;
+import org.openhab.core.thing.ThingTypeUID;
+import org.openhab.core.thing.ThingUID;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * This service connects to the Nest bridge and creates the correct discovery results for Nest devices
+ * as they are found through the API.
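+ *
+ * <p>A usage sketch (registration of the service with the framework is omitted):
+ *
+ * <pre>{@code
+ * NestDiscoveryService discoveryService = new NestDiscoveryService(bridgeHandler);
+ * discoveryService.activate(); // listens to bridge data and publishes results for known devices
+ * }</pre>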
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Add representation properties
+ */
+@NonNullByDefault
+public class NestDiscoveryService extends AbstractDiscoveryService {
+
+ private static final Set<ThingTypeUID> SUPPORTED_THING_TYPES = Stream
+ .of(THING_TYPE_CAMERA, THING_TYPE_THERMOSTAT, THING_TYPE_SMOKE_DETECTOR, THING_TYPE_STRUCTURE)
+ .collect(Collectors.toSet());
+
+ private final Logger logger = LoggerFactory.getLogger(NestDiscoveryService.class);
+
+ private final DiscoveryDataListener<Camera> cameraDiscoveryDataListener = new DiscoveryDataListener<>(Camera.class,
+ THING_TYPE_CAMERA, this::addDeviceDiscoveryResult);
+ private final DiscoveryDataListener<SmokeDetector> smokeDetectorDiscoveryDataListener = new DiscoveryDataListener<>(
+ SmokeDetector.class, THING_TYPE_SMOKE_DETECTOR, this::addDeviceDiscoveryResult);
+ private final DiscoveryDataListener<Structure> structureDiscoveryDataListener = new DiscoveryDataListener<>(
+ Structure.class, THING_TYPE_STRUCTURE, this::addStructureDiscoveryResult);
+ private final DiscoveryDataListener<Thermostat> thermostatDiscoveryDataListener = new DiscoveryDataListener<>(
+ Thermostat.class, THING_TYPE_THERMOSTAT, this::addDeviceDiscoveryResult);
+
+ @SuppressWarnings("rawtypes")
+ private final List<DiscoveryDataListener> discoveryDataListeners = Stream.of(cameraDiscoveryDataListener,
+ smokeDetectorDiscoveryDataListener, structureDiscoveryDataListener, thermostatDiscoveryDataListener)
+ .collect(Collectors.toList());
+
+ private final NestBridgeHandler bridge;
+
+ private static class DiscoveryDataListener<T> implements NestThingDataListener<T> {
+ private Class<T> dataClass;
+ private ThingTypeUID thingTypeUID;
+ private BiConsumer<T, ThingTypeUID> onDiscovered;
+
+ private DiscoveryDataListener(Class<T> dataClass, ThingTypeUID thingTypeUID,
+ BiConsumer<T, ThingTypeUID> onDiscovered) {
+ this.dataClass = dataClass;
+ this.thingTypeUID = thingTypeUID;
+ this.onDiscovered = onDiscovered;
+ }
+
+ @Override
+ public void onNewData(T data) {
+ onDiscovered.accept(data, thingTypeUID);
+ }
+
+ @Override
+ public void onUpdatedData(T oldData, T data) {
+ }
+
+ @Override
+ public void onMissingData(String nestId) {
+ }
+ }
+
+ public NestDiscoveryService(NestBridgeHandler bridge) {
+ super(SUPPORTED_THING_TYPES, 60, true);
+ this.bridge = bridge;
+ }
+
+ @SuppressWarnings("unchecked")
+ public void activate() {
+ discoveryDataListeners.forEach(l -> bridge.addThingDataListener(l.dataClass, l));
+ addDiscoveryResultsFromLastUpdates();
+ }
+
+ @Override
+ @SuppressWarnings("unchecked")
+ public void deactivate() {
+ discoveryDataListeners.forEach(l -> bridge.removeThingDataListener(l.dataClass, l));
+ }
+
+ @Override
+ protected void startScan() {
+ addDiscoveryResultsFromLastUpdates();
+ }
+
+ @SuppressWarnings("unchecked")
+ private void addDiscoveryResultsFromLastUpdates() {
+ discoveryDataListeners
+ .forEach(l -> addDiscoveryResultsFromLastUpdates(l.dataClass, l.thingTypeUID, l.onDiscovered));
+ }
+
+ private <T> void addDiscoveryResultsFromLastUpdates(Class<T> dataClass, ThingTypeUID thingTypeUID,
+ BiConsumer<T, ThingTypeUID> onDiscovered) {
+ List<T> lastUpdates = bridge.getLastUpdates(dataClass);
+ lastUpdates.forEach(lastUpdate -> onDiscovered.accept(lastUpdate, thingTypeUID));
+ }
+
+ private void addDeviceDiscoveryResult(BaseNestDevice device, ThingTypeUID typeUID) {
+ ThingUID bridgeUID = bridge.getThing().getUID();
+ ThingUID thingUID = new ThingUID(typeUID, bridgeUID, device.getDeviceId());
+ logger.debug("Discovered {}", thingUID);
+ Map<String, Object> properties = new HashMap<>();
+ properties.put(NestDeviceConfiguration.DEVICE_ID, device.getDeviceId());
+ properties.put(PROPERTY_FIRMWARE_VERSION, device.getSoftwareVersion());
+ // @formatter:off
+ thingDiscovered(DiscoveryResultBuilder.create(thingUID)
+ .withThingType(typeUID)
+ .withLabel(device.getNameLong())
+ .withBridge(bridgeUID)
+ .withProperties(properties)
+ .withRepresentationProperty(NestDeviceConfiguration.DEVICE_ID)
+ .build()
+ );
+ // @formatter:on
+ }
+
+ public void addStructureDiscoveryResult(Structure structure, ThingTypeUID typeUID) {
+ ThingUID bridgeUID = bridge.getThing().getUID();
+ ThingUID thingUID = new ThingUID(typeUID, bridgeUID, structure.getStructureId());
+ logger.debug("Discovered {}", thingUID);
+ Map<String, Object> properties = new HashMap<>();
+ properties.put(NestStructureConfiguration.STRUCTURE_ID, structure.getStructureId());
+ // @formatter:off
+ thingDiscovered(DiscoveryResultBuilder.create(thingUID)
+ .withThingType(typeUID)
+ .withLabel(structure.getName())
+ .withBridge(bridgeUID)
+ .withProperties(properties)
+ .withRepresentationProperty(NestStructureConfiguration.STRUCTURE_ID)
+ .build()
+ );
+ // @formatter:on
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.exceptions;
+
+/**
+ * Will be thrown when the bridge was unable to resolve the Nest redirect URL.
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Improve exception handling while sending data
+ */
+@SuppressWarnings("serial")
+public class FailedResolvingNestUrlException extends Exception {
+ public FailedResolvingNestUrlException(String message) {
+ super(message);
+ }
+
+ public FailedResolvingNestUrlException(String message, Throwable cause) {
+ super(message, cause);
+ }
+
+ public FailedResolvingNestUrlException(Throwable cause) {
+ super(cause);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.exceptions;
+
+/**
+ * Will be thrown when the bridge was unable to retrieve data.
+ *
+ * @author Martin van Wingerden - Initial contribution
+ * @author Martin van Wingerden - Added more centralized handling of failure when retrieving data
+ */
+@SuppressWarnings("serial")
+public class FailedRetrievingNestDataException extends Exception {
+
+ public FailedRetrievingNestDataException(String message) {
+ super(message);
+ }
+
+ public FailedRetrievingNestDataException(String message, Throwable cause) {
+ super(message, cause);
+ }
+
+ public FailedRetrievingNestDataException(Throwable cause) {
+ super(cause);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.exceptions;
+
+/**
+ * Will be thrown when the bridge was unable to send data.
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Improve exception handling while sending data
+ */
+@SuppressWarnings("serial")
+public class FailedSendingNestDataException extends Exception {
+ public FailedSendingNestDataException(String message) {
+ super(message);
+ }
+
+ public FailedSendingNestDataException(String message, Throwable cause) {
+ super(message, cause);
+ }
+
+ public FailedSendingNestDataException(Throwable cause) {
+ super(cause);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.exceptions;
+
+/**
+ * Will be thrown when there is no valid access token and it was not possible to refresh it.
+ *
+ * @author Martin van Wingerden - Initial contribution
+ * @author Martin van Wingerden - Added more centralized handling of invalid access tokens
+ */
+@SuppressWarnings("serial")
+public class InvalidAccessTokenException extends Exception {
+ public InvalidAccessTokenException(Exception cause) {
+ super(cause);
+ }
+
+ public InvalidAccessTokenException(String message, Throwable cause) {
+ super(message, cause);
+ }
+
+ public InvalidAccessTokenException(String message) {
+ super(message);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.handler;
+
+import java.time.Instant;
+import java.time.ZonedDateTime;
+import java.util.Collection;
+import java.util.Date;
+import java.util.TimeZone;
+import java.util.stream.Collectors;
+
+import javax.measure.Quantity;
+import javax.measure.Unit;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.binding.nest.internal.config.NestDeviceConfiguration;
+import org.openhab.binding.nest.internal.data.NestIdentifiable;
+import org.openhab.binding.nest.internal.listener.NestThingDataListener;
+import org.openhab.binding.nest.internal.rest.NestUpdateRequest;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.QuantityType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.ChannelUID;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.thing.ThingStatusDetail;
+import org.openhab.core.thing.ThingStatusInfo;
+import org.openhab.core.thing.binding.BaseThingHandler;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Base class for the Nest thing handlers, providing common listener registration and channel update logic.
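+ *
+ * <p>A minimal subclass sketch (the channel logic is illustrative):
+ *
+ * <pre>{@code
+ * public class ExampleCameraHandler extends NestBaseHandler<Camera> {
+ *     public ExampleCameraHandler(Thing thing) {
+ *         super(thing, Camera.class);
+ *     }
+ *
+ *     protected State getChannelState(ChannelUID channelUID, Camera camera) {
+ *         return getAsOnOffTypeOrNull(camera.isStreaming());
+ *     }
+ *
+ *     protected void update(Camera oldCamera, Camera camera) {
+ *         updateLinkedChannels(oldCamera, camera);
+ *     }
+ * }
+ * }</pre>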
+ *
+ * @author David Bennett - Initial contribution
+ * @author Martin van Wingerden - Split off NestBaseHandler
+ * @author Wouter Born - Add generic update data type
+ *
+ * @param <T> the type of update data
+ */
+@NonNullByDefault
+public abstract class NestBaseHandler<T> extends BaseThingHandler
+ implements NestThingDataListener<T>, NestIdentifiable {
+ private final Logger logger = LoggerFactory.getLogger(NestBaseHandler.class);
+
+ private @Nullable String deviceId;
+ private Class<T> dataClass;
+
+ NestBaseHandler(Thing thing, Class<T> dataClass) {
+ super(thing);
+ this.dataClass = dataClass;
+ }
+
+ @Override
+ public void initialize() {
+ logger.debug("Initializing handler for {}", getClass().getName());
+
+ NestBridgeHandler handler = getNestBridgeHandler();
+ if (handler != null) {
+ boolean success = handler.addThingDataListener(dataClass, getId(), this);
+ logger.debug("Adding {} with ID '{}' as device data listener, result: {}", getClass().getSimpleName(),
+ getId(), success);
+ } else {
+ logger.debug("Unable to add {} with ID '{}' as device data listener because bridge is null",
+ getClass().getSimpleName(), getId());
+ }
+
+ updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.NONE, "Waiting for refresh");
+
+ T lastUpdate = getLastUpdate();
+ if (lastUpdate != null) {
+ update(null, lastUpdate);
+ }
+ }
+
+ @Override
+ public void dispose() {
+ NestBridgeHandler handler = getNestBridgeHandler();
+ if (handler != null) {
+ handler.removeThingDataListener(dataClass, getId(), this);
+ }
+ }
+
+ protected @Nullable T getLastUpdate() {
+ NestBridgeHandler handler = getNestBridgeHandler();
+ if (handler != null) {
+ return handler.getLastUpdate(dataClass, getId());
+ }
+ return null;
+ }
+
+ protected void addUpdateRequest(String updatePath, String field, Object value) {
+ NestBridgeHandler handler = getNestBridgeHandler();
+ if (handler != null) {
+ // @formatter:off
+ handler.addUpdateRequest(new NestUpdateRequest.Builder()
+ .withBasePath(updatePath)
+ .withIdentifier(getId())
+ .withAdditionalValue(field, value)
+ .build());
+ // @formatter:on
+ }
+ }
+
+ @Override
+ public String getId() {
+ return getDeviceId();
+ }
+
+ protected String getDeviceId() {
+ String localDeviceId = deviceId;
+ if (localDeviceId == null) {
+ localDeviceId = getConfigAs(NestDeviceConfiguration.class).deviceId;
+ deviceId = localDeviceId;
+ }
+ return localDeviceId;
+ }
+
+ protected @Nullable NestBridgeHandler getNestBridgeHandler() {
+ Bridge bridge = getBridge();
+ return bridge != null ? (NestBridgeHandler) bridge.getHandler() : null;
+ }
+
+ protected abstract State getChannelState(ChannelUID channelUID, T data);
+
+ protected State getAsDateTimeTypeOrNull(@Nullable Date date) {
+ if (date == null) {
+ return UnDefType.NULL;
+ }
+
+ long offsetMillis = TimeZone.getDefault().getOffset(date.getTime());
+ Instant instant = date.toInstant().plusMillis(offsetMillis);
+ return new DateTimeType(ZonedDateTime.ofInstant(instant, TimeZone.getDefault().toZoneId()));
+ }
+
+ protected State getAsDecimalTypeOrNull(@Nullable Integer value) {
+ return value == null ? UnDefType.NULL : new DecimalType(value);
+ }
+
+ protected State getAsOnOffTypeOrNull(@Nullable Boolean value) {
+ return value == null ? UnDefType.NULL : value ? OnOffType.ON : OnOffType.OFF;
+ }
+
+ protected <U extends Quantity<U>> State getAsQuantityTypeOrNull(@Nullable Number value, Unit<U> unit) {
+ return value == null ? UnDefType.NULL : new QuantityType<>(value, unit);
+ }
+
+ protected State getAsStringTypeOrNull(@Nullable Object value) {
+ return value == null ? UnDefType.NULL : new StringType(value.toString());
+ }
+
+ protected State getAsStringTypeListOrNull(@Nullable Collection<?> values) {
+ return values == null || values.isEmpty() ? UnDefType.NULL
+ : new StringType(values.stream().map(v -> v.toString()).collect(Collectors.joining(",")));
+ }
+
+ protected boolean isNotHandling(NestIdentifiable nestIdentifiable) {
+ return !(getId().equals(nestIdentifiable.getId()));
+ }
+
+ protected void updateLinkedChannels(T oldData, T data) {
+ getThing().getChannels().stream().map(c -> c.getUID()).filter(this::isLinked).forEach(channelUID -> {
+ State newState = getChannelState(channelUID, data);
+ if (oldData == null || !getChannelState(channelUID, oldData).equals(newState)) {
+ logger.debug("Updating {}", channelUID);
+ updateState(channelUID, newState);
+ }
+ });
+ }
+
+ @Override
+ public void onNewData(T data) {
+ update(null, data);
+ }
+
+ @Override
+ public void onUpdatedData(T oldData, T data) {
+ update(oldData, data);
+ }
+
+ @Override
+ public void onMissingData(String nestId) {
+ thing.setStatusInfo(
+ new ThingStatusInfo(ThingStatus.OFFLINE, ThingStatusDetail.GONE, "Missing from streaming updates"));
+ }
+
+ protected abstract void update(T oldData, T data);
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.handler;
+
+import static java.util.concurrent.TimeUnit.SECONDS;
+import static org.openhab.binding.nest.internal.NestBindingConstants.JSON_CONTENT_TYPE;
+
+import java.io.ByteArrayInputStream;
+import java.io.IOException;
+import java.nio.charset.StandardCharsets;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Properties;
+import java.util.Set;
+import java.util.concurrent.CopyOnWriteArrayList;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+
+import javax.ws.rs.client.ClientBuilder;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.binding.nest.internal.NestUtils;
+import org.openhab.binding.nest.internal.config.NestBridgeConfiguration;
+import org.openhab.binding.nest.internal.data.ErrorData;
+import org.openhab.binding.nest.internal.data.NestIdentifiable;
+import org.openhab.binding.nest.internal.data.TopLevelData;
+import org.openhab.binding.nest.internal.exceptions.FailedResolvingNestUrlException;
+import org.openhab.binding.nest.internal.exceptions.FailedSendingNestDataException;
+import org.openhab.binding.nest.internal.exceptions.InvalidAccessTokenException;
+import org.openhab.binding.nest.internal.listener.NestStreamingDataListener;
+import org.openhab.binding.nest.internal.listener.NestThingDataListener;
+import org.openhab.binding.nest.internal.rest.NestAuthorizer;
+import org.openhab.binding.nest.internal.rest.NestStreamingRestClient;
+import org.openhab.binding.nest.internal.rest.NestUpdateRequest;
+import org.openhab.binding.nest.internal.update.NestCompositeUpdateHandler;
+import org.openhab.core.config.core.Configuration;
+import org.openhab.core.io.net.http.HttpUtil;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.ChannelUID;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.thing.ThingStatusDetail;
+import org.openhab.core.thing.binding.BaseBridgeHandler;
+import org.openhab.core.thing.binding.ThingHandler;
+import org.openhab.core.types.Command;
+import org.openhab.core.types.RefreshType;
+import org.osgi.service.jaxrs.client.SseEventSourceFactory;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * This bridge handler connects to Nest and handles all the API requests. It receives
+ * updated data via REST streaming and coordinates with the other handlers
+ * to get the data updated on the correct things.
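+ *
+ * <p>Thing handlers interact with the bridge roughly as follows (a sketch; see {@link NestBaseHandler}):
+ *
+ * <pre>{@code
+ * bridgeHandler.addThingDataListener(Thermostat.class, nestId, listener);
+ * Thermostat lastUpdate = bridgeHandler.getLastUpdate(Thermostat.class, nestId);
+ * bridgeHandler.addUpdateRequest(updateRequest); // a NestUpdateRequest built by the thing handler
+ * }</pre>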
+ *
+ * @author David Bennett - Initial contribution
+ * @author Martin van Wingerden - Use listeners not only for discovery but for all data processing
+ * @author Wouter Born - Improve exception and URL redirect handling
+ */
+@NonNullByDefault
+public class NestBridgeHandler extends BaseBridgeHandler implements NestStreamingDataListener {
+
+ private static final int REQUEST_TIMEOUT = (int) TimeUnit.SECONDS.toMillis(30);
+
+ private final Logger logger = LoggerFactory.getLogger(NestBridgeHandler.class);
+
+ private final ClientBuilder clientBuilder;
+ private final SseEventSourceFactory eventSourceFactory;
+ private final List<NestUpdateRequest> nestUpdateRequests = new CopyOnWriteArrayList<>();
+ private final NestCompositeUpdateHandler updateHandler = new NestCompositeUpdateHandler(
+ this::getPresentThingsNestIds);
+
+ private @NonNullByDefault({}) NestAuthorizer authorizer;
+ private @NonNullByDefault({}) NestBridgeConfiguration config;
+
+ private @Nullable ScheduledFuture<?> initializeJob;
+ private @Nullable ScheduledFuture<?> transmitJob;
+ private @Nullable NestRedirectUrlSupplier redirectUrlSupplier;
+ private @Nullable NestStreamingRestClient streamingRestClient;
+
+ /**
+ * Creates the bridge handler to connect to Nest.
+ *
+ * @param bridge The bridge to connect to Nest with.
+ * @param clientBuilder The JAX-RS client builder used by the streaming REST client.
+ * @param eventSourceFactory The factory for creating SSE event sources.
+ */
+ public NestBridgeHandler(Bridge bridge, ClientBuilder clientBuilder, SseEventSourceFactory eventSourceFactory) {
+ super(bridge);
+ this.clientBuilder = clientBuilder;
+ this.eventSourceFactory = eventSourceFactory;
+ }
+
+ /**
+ * Initialize the connection to Nest.
+ */
+ @Override
+ public void initialize() {
+ logger.debug("Initializing Nest bridge handler");
+
+ config = getConfigAs(NestBridgeConfiguration.class);
+ authorizer = new NestAuthorizer(config);
+ updateStatus(ThingStatus.UNKNOWN, ThingStatusDetail.NONE, "Starting poll query");
+
+ initializeJob = scheduler.schedule(() -> {
+ try {
+ logger.debug("Product ID {}", config.productId);
+ logger.debug("Product Secret {}", config.productSecret);
+ logger.debug("Pincode {}", config.pincode);
+ logger.debug("Access Token {}", getExistingOrNewAccessToken());
+ redirectUrlSupplier = createRedirectUrlSupplier();
+ restartStreamingUpdates();
+ } catch (InvalidAccessTokenException e) {
+ logger.debug("Invalid access token", e);
+ updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.CONFIGURATION_ERROR,
+ "Token is invalid and could not be refreshed: " + e.getMessage());
+ }
+ }, 0, TimeUnit.SECONDS);
+
+ logger.debug("Finished initializing Nest bridge handler");
+ }
+
+ /**
+ * Clean up the handler.
+ */
+ @Override
+ public void dispose() {
+ logger.debug("Nest bridge disposed");
+ stopStreamingUpdates();
+
+ ScheduledFuture<?> localInitializeJob = initializeJob;
+ if (localInitializeJob != null && !localInitializeJob.isCancelled()) {
+ localInitializeJob.cancel(true);
+ initializeJob = null;
+ }
+
+ ScheduledFuture<?> localTransmitJob = transmitJob;
+ if (localTransmitJob != null && !localTransmitJob.isCancelled()) {
+ localTransmitJob.cancel(true);
+ transmitJob = null;
+ }
+
+ this.authorizer = null;
+ this.redirectUrlSupplier = null;
+ this.streamingRestClient = null;
+ }
+
+ public <T> boolean addThingDataListener(Class<T> dataClass, NestThingDataListener<T> listener) {
+ return updateHandler.addListener(dataClass, listener);
+ }
+
+ public <T> boolean addThingDataListener(Class<T> dataClass, String nestId, NestThingDataListener<T> listener) {
+ return updateHandler.addListener(dataClass, nestId, listener);
+ }
+
+ /**
+ * Adds the update request to the queue of pending requests and schedules a transmit job if none is running.
+ */
+ public void addUpdateRequest(NestUpdateRequest request) {
+ nestUpdateRequests.add(request);
+ scheduleTransmitJobForPendingRequests();
+ }
+
+ protected NestRedirectUrlSupplier createRedirectUrlSupplier() throws InvalidAccessTokenException {
+ return new NestRedirectUrlSupplier(getHttpHeaders());
+ }
+
+ private String getExistingOrNewAccessToken() throws InvalidAccessTokenException {
+ String accessToken = config.accessToken;
+ if (accessToken == null || accessToken.isEmpty()) {
+ accessToken = authorizer.getNewAccessToken();
+ config.accessToken = accessToken;
+ config.pincode = "";
+ // Update and save the access token in the bridge configuration
+ Configuration configuration = editConfiguration();
+ configuration.put(NestBridgeConfiguration.ACCESS_TOKEN, config.accessToken);
+ configuration.put(NestBridgeConfiguration.PINCODE, config.pincode);
+ updateConfiguration(configuration);
+ logger.debug("Retrieved new access token: {}", config.accessToken);
+ return accessToken;
+ } else {
+ logger.debug("Re-using access token from configuration: {}", accessToken);
+ return accessToken;
+ }
+ }
+
+ protected Properties getHttpHeaders() throws InvalidAccessTokenException {
+ Properties httpHeaders = new Properties();
+ httpHeaders.put("Authorization", "Bearer " + getExistingOrNewAccessToken());
+ httpHeaders.put("Content-Type", JSON_CONTENT_TYPE);
+ return httpHeaders;
+ }
+
+ public @Nullable <T> T getLastUpdate(Class<T> dataClass, String nestId) {
+ return updateHandler.getLastUpdate(dataClass, nestId);
+ }
+
+ public <T> List<T> getLastUpdates(Class<T> dataClass) {
+ return updateHandler.getLastUpdates(dataClass);
+ }
+
+ private NestRedirectUrlSupplier getOrCreateRedirectUrlSupplier() throws InvalidAccessTokenException {
+ NestRedirectUrlSupplier localRedirectUrlSupplier = redirectUrlSupplier;
+ if (localRedirectUrlSupplier == null) {
+ localRedirectUrlSupplier = createRedirectUrlSupplier();
+ redirectUrlSupplier = localRedirectUrlSupplier;
+ }
+ return localRedirectUrlSupplier;
+ }
+
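+ /**
+ * Returns the Nest IDs of the bridge's things that have a handler and are not marked GONE.
+ */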
+ private Set<String> getPresentThingsNestIds() {
+ Set<String> nestIds = new HashSet<>();
+ for (Thing thing : getThing().getThings()) {
+ ThingHandler handler = thing.getHandler();
+ if (handler != null && thing.getStatusInfo().getStatusDetail() != ThingStatusDetail.GONE) {
+ nestIds.add(((NestIdentifiable) handler).getId());
+ }
+ }
+ return nestIds;
+ }
+
+ /**
+ * Handles an incoming command; the bridge itself only processes REFRESH commands.
+ */
+ @Override
+ public void handleCommand(ChannelUID channelUID, Command command) {
+ if (command instanceof RefreshType) {
+ logger.debug("Refresh command received");
+ updateHandler.resendLastUpdates();
+ }
+ }
+
+ private void jsonToPutUrl(NestUpdateRequest request)
+ throws FailedSendingNestDataException, InvalidAccessTokenException, FailedResolvingNestUrlException {
+ try {
+ NestRedirectUrlSupplier localRedirectUrlSupplier = redirectUrlSupplier;
+ if (localRedirectUrlSupplier == null) {
+ throw new FailedResolvingNestUrlException("redirectUrlSupplier is null");
+ }
+
+ String url = localRedirectUrlSupplier.getRedirectUrl() + request.getUpdatePath();
+ logger.debug("Putting data to: {}", url);
+
+ String jsonContent = NestUtils.toJson(request.getValues());
+ logger.debug("PUT content: {}", jsonContent);
+
+ ByteArrayInputStream inputStream = new ByteArrayInputStream(jsonContent.getBytes(StandardCharsets.UTF_8));
+ String jsonResponse = HttpUtil.executeUrl("PUT", url, getHttpHeaders(), inputStream, JSON_CONTENT_TYPE,
+ REQUEST_TIMEOUT);
+ logger.debug("PUT response: {}", jsonResponse);
+
+ ErrorData error = NestUtils.fromJson(jsonResponse, ErrorData.class);
+ if (error.getError() != null && !error.getError().isBlank()) {
+ logger.debug("Nest API error: {}", error);
+ logger.warn("Nest API error: {}", error.getMessage());
+ }
+ } catch (IOException e) {
+ throw new FailedSendingNestDataException("Failed to send data", e);
+ }
+ }
+
+ @Override
+ public void onAuthorizationRevoked(String token) {
+ updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.CONFIGURATION_ERROR,
+ "Authorization token revoked: " + token);
+ }
+
+ @Override
+ public void onConnected() {
+ updateStatus(ThingStatus.ONLINE, ThingStatusDetail.NONE, "Streaming data connection established");
+ scheduleTransmitJobForPendingRequests();
+ }
+
+ @Override
+ public void onDisconnected() {
+ updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.COMMUNICATION_ERROR, "Streaming data disconnected");
+ }
+
+ @Override
+ public void onError(String message) {
+ updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.COMMUNICATION_ERROR, message);
+ }
+
+ @Override
+ public void onNewTopLevelData(TopLevelData data) {
+ updateHandler.handleUpdate(data);
+ updateStatus(ThingStatus.ONLINE, ThingStatusDetail.NONE, "Receiving streaming data");
+ }
+
+ public <T> boolean removeThingDataListener(Class<T> dataClass, NestThingDataListener<T> listener) {
+ return updateHandler.removeListener(dataClass, listener);
+ }
+
+ public <T> boolean removeThingDataListener(Class<T> dataClass, String nestId, NestThingDataListener<T> listener) {
+ return updateHandler.removeListener(dataClass, nestId, listener);
+ }
+
+ private void restartStreamingUpdates() {
+ synchronized (this) {
+ stopStreamingUpdates();
+ startStreamingUpdates();
+ }
+ }
+
+ private void scheduleTransmitJobForPendingRequests() {
+ ScheduledFuture<?> localTransmitJob = transmitJob;
+ if (!nestUpdateRequests.isEmpty() && (localTransmitJob == null || localTransmitJob.isDone())) {
+ transmitJob = scheduler.schedule(this::transmitQueue, 0, SECONDS);
+ }
+ }
+
+ private void startStreamingUpdates() {
+ synchronized (this) {
+ try {
+ NestStreamingRestClient localStreamingRestClient = new NestStreamingRestClient(
+ getExistingOrNewAccessToken(), clientBuilder, eventSourceFactory,
+ getOrCreateRedirectUrlSupplier(), scheduler);
+ localStreamingRestClient.addStreamingDataListener(this);
+ localStreamingRestClient.start();
+
+ streamingRestClient = localStreamingRestClient;
+ } catch (InvalidAccessTokenException e) {
+ logger.debug("Invalid access token", e);
+ updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.CONFIGURATION_ERROR,
+ "Token is invalid and could not be refreshed: " + e.getMessage());
+ }
+ }
+ }
+
+ private void stopStreamingUpdates() {
+ NestStreamingRestClient localStreamingRestClient = streamingRestClient;
+ if (localStreamingRestClient != null) {
+ synchronized (this) {
+ localStreamingRestClient.stop();
+ localStreamingRestClient.removeStreamingDataListener(this);
+ streamingRestClient = null;
+ }
+ }
+ }
+
+ private void transmitQueue() {
+ if (getThing().getStatus() == ThingStatus.OFFLINE) {
+ updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.COMMUNICATION_ERROR,
+ "Not transmitting events because bridge is OFFLINE");
+ return;
+ }
+
+ try {
+ while (!nestUpdateRequests.isEmpty()) {
+ // nestUpdateRequests is a CopyOnWriteArrayList so its iterator does not support remove operations
+ NestUpdateRequest request = nestUpdateRequests.get(0);
+ jsonToPutUrl(request);
+ nestUpdateRequests.remove(request);
+ }
+ } catch (InvalidAccessTokenException e) {
+ logger.debug("Invalid access token", e);
+ updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.CONFIGURATION_ERROR,
+ "Token is invalid and could not be refreshed: " + e.getMessage());
+ } catch (FailedResolvingNestUrlException e) {
+ logger.debug("Unable to resolve redirect URL", e);
+ updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.COMMUNICATION_ERROR, e.getMessage());
+ scheduler.schedule(this::restartStreamingUpdates, 5, SECONDS);
+ } catch (FailedSendingNestDataException e) {
+ logger.debug("Error sending data", e);
+ updateStatus(ThingStatus.OFFLINE, ThingStatusDetail.COMMUNICATION_ERROR, e.getMessage());
+ scheduler.schedule(this::restartStreamingUpdates, 5, SECONDS);
+
+ NestRedirectUrlSupplier localRedirectUrlSupplier = redirectUrlSupplier;
+ if (localRedirectUrlSupplier != null) {
+ localRedirectUrlSupplier.resetCache();
+ }
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.handler;
+
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+import static org.openhab.core.thing.Thing.PROPERTY_FIRMWARE_VERSION;
+import static org.openhab.core.types.RefreshType.REFRESH;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.binding.nest.internal.data.Camera;
+import org.openhab.binding.nest.internal.data.CameraEvent;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.thing.ChannelUID;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.types.Command;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Handles all the updates to the camera, as well as the commands that send
+ * updates to Nest.
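+ *
+ * <p>For example, an ON command on the streaming channel results in a PUT of
+ * {@code {"is_streaming": true}} for this camera (a sketch; the channel UID is illustrative):
+ *
+ * <pre>{@code
+ * cameraHandler.handleCommand(streamingChannelUID, OnOffType.ON);
+ * }</pre>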
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Handle channel refresh command
+ */
+@NonNullByDefault
+public class NestCameraHandler extends NestBaseHandler<Camera> {
+ private final Logger logger = LoggerFactory.getLogger(NestCameraHandler.class);
+
+ public NestCameraHandler(Thing thing) {
+ super(thing, Camera.class);
+ }
+
+ @Override
+ protected State getChannelState(ChannelUID channelUID, Camera camera) {
+ if (channelUID.getId().startsWith(CHANNEL_GROUP_CAMERA_PREFIX)) {
+ return getCameraChannelState(channelUID, camera);
+ } else if (channelUID.getId().startsWith(CHANNEL_GROUP_LAST_EVENT_PREFIX)) {
+ return getLastEventChannelState(channelUID, camera);
+ } else {
+ logger.error("Unsupported channelId '{}'", channelUID.getId());
+ return UnDefType.UNDEF;
+ }
+ }
+
+ protected State getCameraChannelState(ChannelUID channelUID, Camera camera) {
+ switch (channelUID.getId()) {
+ case CHANNEL_CAMERA_APP_URL:
+ return getAsStringTypeOrNull(camera.getAppUrl());
+ case CHANNEL_CAMERA_AUDIO_INPUT_ENABLED:
+ return getAsOnOffTypeOrNull(camera.isAudioInputEnabled());
+ case CHANNEL_CAMERA_LAST_ONLINE_CHANGE:
+ return getAsDateTimeTypeOrNull(camera.getLastIsOnlineChange());
+ case CHANNEL_CAMERA_PUBLIC_SHARE_ENABLED:
+ return getAsOnOffTypeOrNull(camera.isPublicShareEnabled());
+ case CHANNEL_CAMERA_PUBLIC_SHARE_URL:
+ return getAsStringTypeOrNull(camera.getPublicShareUrl());
+ case CHANNEL_CAMERA_SNAPSHOT_URL:
+ return getAsStringTypeOrNull(camera.getSnapshotUrl());
+ case CHANNEL_CAMERA_STREAMING:
+ return getAsOnOffTypeOrNull(camera.isStreaming());
+ case CHANNEL_CAMERA_VIDEO_HISTORY_ENABLED:
+ return getAsOnOffTypeOrNull(camera.isVideoHistoryEnabled());
+ case CHANNEL_CAMERA_WEB_URL:
+ return getAsStringTypeOrNull(camera.getWebUrl());
+ default:
+ logger.error("Unsupported channelId '{}'", channelUID.getId());
+ return UnDefType.UNDEF;
+ }
+ }
+
+ protected State getLastEventChannelState(ChannelUID channelUID, Camera camera) {
+ CameraEvent lastEvent = camera.getLastEvent();
+ if (lastEvent == null) {
+ return UnDefType.NULL;
+ }
+
+ switch (channelUID.getId()) {
+ case CHANNEL_LAST_EVENT_ACTIVITY_ZONES:
+ return getAsStringTypeListOrNull(lastEvent.getActivityZones());
+ case CHANNEL_LAST_EVENT_ANIMATED_IMAGE_URL:
+ return getAsStringTypeOrNull(lastEvent.getAnimatedImageUrl());
+ case CHANNEL_LAST_EVENT_APP_URL:
+ return getAsStringTypeOrNull(lastEvent.getAppUrl());
+ case CHANNEL_LAST_EVENT_END_TIME:
+ return getAsDateTimeTypeOrNull(lastEvent.getEndTime());
+ case CHANNEL_LAST_EVENT_HAS_MOTION:
+ return getAsOnOffTypeOrNull(lastEvent.isHasMotion());
+ case CHANNEL_LAST_EVENT_HAS_PERSON:
+ return getAsOnOffTypeOrNull(lastEvent.isHasPerson());
+ case CHANNEL_LAST_EVENT_HAS_SOUND:
+ return getAsOnOffTypeOrNull(lastEvent.isHasSound());
+ case CHANNEL_LAST_EVENT_IMAGE_URL:
+ return getAsStringTypeOrNull(lastEvent.getImageUrl());
+ case CHANNEL_LAST_EVENT_START_TIME:
+ return getAsDateTimeTypeOrNull(lastEvent.getStartTime());
+ case CHANNEL_LAST_EVENT_URLS_EXPIRE_TIME:
+ return getAsDateTimeTypeOrNull(lastEvent.getUrlsExpireTime());
+ case CHANNEL_LAST_EVENT_WEB_URL:
+ return getAsStringTypeOrNull(lastEvent.getWebUrl());
+ default:
+ logger.error("Unsupported channelId '{}'", channelUID.getId());
+ return UnDefType.UNDEF;
+ }
+ }
+
+ @Override
+ public void handleCommand(ChannelUID channelUID, Command command) {
+ if (REFRESH.equals(command)) {
+ Camera lastUpdate = getLastUpdate();
+ if (lastUpdate != null) {
+ updateState(channelUID, getChannelState(channelUID, lastUpdate));
+ }
+ } else if (CHANNEL_CAMERA_STREAMING.equals(channelUID.getId())) {
+ // Change the streaming state.
+ if (command instanceof OnOffType) {
+ // Set streaming to be the command value.
+ addUpdateRequest("is_streaming", command == OnOffType.ON);
+ }
+ }
+ }
+
+ private void addUpdateRequest(String field, Object value) {
+ addUpdateRequest(NEST_CAMERA_UPDATE_PATH, field, value);
+ }
+
+ @Override
+ protected void update(Camera oldCamera, Camera camera) {
+ logger.debug("Updating {}", getThing().getUID());
+
+ updateLinkedChannels(oldCamera, camera);
+ updateProperty(PROPERTY_FIRMWARE_VERSION, camera.getSoftwareVersion());
+
+ ThingStatus newStatus = camera.isOnline() == null ? ThingStatus.UNKNOWN
+ : camera.isOnline() ? ThingStatus.ONLINE : ThingStatus.OFFLINE;
+ if (newStatus != thing.getStatus()) {
+ updateStatus(newStatus);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.handler;
+
+import java.util.Properties;
+import java.util.concurrent.TimeUnit;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jetty.client.HttpClient;
+import org.eclipse.jetty.client.api.ContentResponse;
+import org.eclipse.jetty.client.api.Request;
+import org.eclipse.jetty.http.HttpHeader;
+import org.eclipse.jetty.http.HttpMethod;
+import org.eclipse.jetty.http.HttpStatus;
+import org.eclipse.jetty.util.ssl.SslContextFactory;
+import org.openhab.binding.nest.internal.NestBindingConstants;
+import org.openhab.binding.nest.internal.exceptions.FailedResolvingNestUrlException;
+import org.openhab.core.io.net.http.HttpUtil;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Supplies resolved redirect URLs of {@link NestBindingConstants#NEST_URL} so they can be used with HTTP clients
+ * that, like the Jetty client used by {@link HttpUtil}, do not pass Authorization headers after redirects.
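+ *
+ * <p>Typical use (the headers must contain the Authorization header built by the bridge handler):
+ *
+ * <pre>{@code
+ * NestRedirectUrlSupplier supplier = new NestRedirectUrlSupplier(httpHeaders);
+ * String redirectUrl = supplier.getRedirectUrl(); // resolved once, then cached
+ * supplier.resetCache(); // forces re-resolution on the next call
+ * }</pre>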
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Extract resolving redirect URL from NestBridgeHandler into NestRedirectUrlSupplier
+ */
+@NonNullByDefault
+public class NestRedirectUrlSupplier {
+
+ private final Logger logger = LoggerFactory.getLogger(NestRedirectUrlSupplier.class);
+
+ protected String cachedUrl = "";
+
+ protected Properties httpHeaders;
+
+ public NestRedirectUrlSupplier(Properties httpHeaders) {
+ this.httpHeaders = httpHeaders;
+ }
+
+ public String getRedirectUrl() throws FailedResolvingNestUrlException {
+ if (cachedUrl.isEmpty()) {
+ cachedUrl = resolveRedirectUrl();
+ }
+ return cachedUrl;
+ }
+
+ public void resetCache() {
+ cachedUrl = "";
+ }
+
+ /**
+ * Resolves the redirect URL for calls using the {@link NestBindingConstants#NEST_URL}.
+ *
+ * The Jetty client used by {@link HttpUtil} will not pass the Authorization header after a redirect, resulting in
+ * "401 Unauthorized" errors.
+ *
+ * Note that this workaround currently does not use any configured proxy like {@link HttpUtil} does.
+ *
+ * @see <a href="https://developers.nest.com/documentation/cloud/how-to-handle-redirects">How to handle redirects</a>
+ */
+ private String resolveRedirectUrl() throws FailedResolvingNestUrlException {
+ HttpClient httpClient = new HttpClient(new SslContextFactory());
+ httpClient.setFollowRedirects(false);
+
+ Request request = httpClient.newRequest(NestBindingConstants.NEST_URL).method(HttpMethod.GET).timeout(30,
+ TimeUnit.SECONDS);
+ for (String httpHeaderKey : httpHeaders.stringPropertyNames()) {
+ request.header(httpHeaderKey, httpHeaders.getProperty(httpHeaderKey));
+ }
+
+ ContentResponse response;
+ try {
+ httpClient.start();
+ response = request.send();
+ httpClient.stop();
+ } catch (Exception e) {
+ throw new FailedResolvingNestUrlException("Failed to resolve redirect URL: " + e.getMessage(), e);
+ }
+
+ int status = response.getStatus();
+ String redirectUrl = response.getHeaders().get(HttpHeader.LOCATION);
+
+ if (status != HttpStatus.TEMPORARY_REDIRECT_307) {
+ logger.debug("Redirect status: {}", status);
+ logger.debug("Redirect response: {}", response.getContentAsString());
+ throw new FailedResolvingNestUrlException("Failed to get redirect URL, expected status "
+ + HttpStatus.TEMPORARY_REDIRECT_307 + " but was " + status);
+ } else if (redirectUrl == null || redirectUrl.isEmpty()) {
+ throw new FailedResolvingNestUrlException("Redirect URL is empty");
+ }
+
+ redirectUrl = redirectUrl.endsWith("/") ? redirectUrl.substring(0, redirectUrl.length() - 1) : redirectUrl;
+ logger.debug("Redirect URL: {}", redirectUrl);
+ return redirectUrl;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.handler;
+
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+import static org.openhab.core.thing.Thing.PROPERTY_FIRMWARE_VERSION;
+import static org.openhab.core.types.RefreshType.REFRESH;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.binding.nest.internal.data.SmokeDetector;
+import org.openhab.binding.nest.internal.data.SmokeDetector.BatteryHealth;
+import org.openhab.core.thing.ChannelUID;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.types.Command;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * The smoke detector handler, which handles the data from Nest for the smoke detector.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Handle channel refresh command
+ */
+@NonNullByDefault
+public class NestSmokeDetectorHandler extends NestBaseHandler<SmokeDetector> {
+ private final Logger logger = LoggerFactory.getLogger(NestSmokeDetectorHandler.class);
+
+ public NestSmokeDetectorHandler(Thing thing) {
+ super(thing, SmokeDetector.class);
+ }
+
+ @Override
+ protected State getChannelState(ChannelUID channelUID, SmokeDetector smokeDetector) {
+ switch (channelUID.getId()) {
+ case CHANNEL_CO_ALARM_STATE:
+ return getAsStringTypeOrNull(smokeDetector.getCoAlarmState());
+ case CHANNEL_LAST_CONNECTION:
+ return getAsDateTimeTypeOrNull(smokeDetector.getLastConnection());
+ case CHANNEL_LAST_MANUAL_TEST_TIME:
+ return getAsDateTimeTypeOrNull(smokeDetector.getLastManualTestTime());
+ case CHANNEL_LOW_BATTERY:
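+ // The low battery channel is ON when Nest reports that the battery needs to be replaced.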
+ return getAsOnOffTypeOrNull(smokeDetector.getBatteryHealth() == null ? null
+ : smokeDetector.getBatteryHealth() == BatteryHealth.REPLACE);
+ case CHANNEL_MANUAL_TEST_ACTIVE:
+ return getAsOnOffTypeOrNull(smokeDetector.isManualTestActive());
+ case CHANNEL_SMOKE_ALARM_STATE:
+ return getAsStringTypeOrNull(smokeDetector.getSmokeAlarmState());
+ case CHANNEL_UI_COLOR_STATE:
+ return getAsStringTypeOrNull(smokeDetector.getUiColorState());
+ default:
+ logger.error("Unsupported channelId '{}'", channelUID.getId());
+ return UnDefType.UNDEF;
+ }
+ }
+
+ /**
+ * Handles any incoming command requests.
+ */
+ @Override
+ public void handleCommand(ChannelUID channelUID, Command command) {
+ if (REFRESH.equals(command)) {
+ SmokeDetector lastUpdate = getLastUpdate();
+ if (lastUpdate != null) {
+ updateState(channelUID, getChannelState(channelUID, lastUpdate));
+ }
+ }
+ }
+
+ @Override
+ protected void update(SmokeDetector oldSmokeDetector, SmokeDetector smokeDetector) {
+ logger.debug("Updating {}", getThing().getUID());
+
+ updateLinkedChannels(oldSmokeDetector, smokeDetector);
+ updateProperty(PROPERTY_FIRMWARE_VERSION, smokeDetector.getSoftwareVersion());
+
+ ThingStatus newStatus = smokeDetector.isOnline() == null ? ThingStatus.UNKNOWN
+ : smokeDetector.isOnline() ? ThingStatus.ONLINE : ThingStatus.OFFLINE;
+ if (newStatus != thing.getStatus()) {
+ updateStatus(newStatus);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.handler;
+
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+import static org.openhab.core.types.RefreshType.REFRESH;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.binding.nest.internal.config.NestStructureConfiguration;
+import org.openhab.binding.nest.internal.data.Structure;
+import org.openhab.binding.nest.internal.data.Structure.HomeAwayState;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.thing.ChannelUID;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.types.Command;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Deals with the structures on the Nest API, turning each into a thing in openHAB.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Handle channel refresh command
+ */
+@NonNullByDefault
+public class NestStructureHandler extends NestBaseHandler<Structure> {
+ private final Logger logger = LoggerFactory.getLogger(NestStructureHandler.class);
+
+ private @Nullable String structureId;
+
+ public NestStructureHandler(Thing thing) {
+ super(thing, Structure.class);
+ }
+
+ @Override
+ protected State getChannelState(ChannelUID channelUID, Structure structure) {
+ switch (channelUID.getId()) {
+ case CHANNEL_AWAY:
+ return getAsStringTypeOrNull(structure.getAway());
+ case CHANNEL_CO_ALARM_STATE:
+ return getAsStringTypeOrNull(structure.getCoAlarmState());
+ case CHANNEL_COUNTRY_CODE:
+ return getAsStringTypeOrNull(structure.getCountryCode());
+ case CHANNEL_ETA_BEGIN:
+ return getAsDateTimeTypeOrNull(structure.getEtaBegin());
+ case CHANNEL_PEAK_PERIOD_END_TIME:
+ return getAsDateTimeTypeOrNull(structure.getPeakPeriodEndTime());
+ case CHANNEL_PEAK_PERIOD_START_TIME:
+ return getAsDateTimeTypeOrNull(structure.getPeakPeriodStartTime());
+ case CHANNEL_POSTAL_CODE:
+ return getAsStringTypeOrNull(structure.getPostalCode());
+ case CHANNEL_RUSH_HOUR_REWARDS_ENROLLMENT:
+ return getAsOnOffTypeOrNull(structure.isRhrEnrollment());
+ case CHANNEL_SECURITY_STATE:
+ return getAsStringTypeOrNull(structure.getWwnSecurityState());
+ case CHANNEL_SMOKE_ALARM_STATE:
+ return getAsStringTypeOrNull(structure.getSmokeAlarmState());
+ case CHANNEL_TIME_ZONE:
+ return getAsStringTypeOrNull(structure.getTimeZone());
+ default:
+ logger.error("Unsupported channelId '{}'", channelUID.getId());
+ return UnDefType.UNDEF;
+ }
+ }
+
+ @Override
+ public String getId() {
+ return getStructureId();
+ }
+
+ private String getStructureId() {
+ String localStructureId = structureId;
+ if (localStructureId == null) {
+ localStructureId = getConfigAs(NestStructureConfiguration.class).structureId;
+ structureId = localStructureId;
+ }
+ return localStructureId;
+ }
+
+ /**
+ * Handles updating the details of this structure by sending the request
+ * to Nest.
+ *
+ * @param channelUID the channel to update
+ * @param command the command to apply
+ */
+ @Override
+ public void handleCommand(ChannelUID channelUID, Command command) {
+ if (REFRESH.equals(command)) {
+ Structure lastUpdate = getLastUpdate();
+ if (lastUpdate != null) {
+ updateState(channelUID, getChannelState(channelUID, lastUpdate));
+ }
+ } else if (CHANNEL_AWAY.equals(channelUID.getId())) {
+ // Change the home/away state.
+ if (command instanceof StringType) {
+ StringType cmd = (StringType) command;
+ // Set the away mode to be the command value.
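+ // Note: HomeAwayState.valueOf throws IllegalArgumentException for values other than its constants.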
+ addUpdateRequest(NEST_STRUCTURE_UPDATE_PATH, "away", HomeAwayState.valueOf(cmd.toString()));
+ }
+ }
+ }
+
+ @Override
+ protected void update(Structure oldStructure, Structure structure) {
+ logger.debug("Updating {}", getThing().getUID());
+
+ updateLinkedChannels(oldStructure, structure);
+
+ if (ThingStatus.ONLINE != thing.getStatus()) {
+ updateStatus(ThingStatus.ONLINE);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.handler;
+
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+import static org.openhab.core.library.unit.SIUnits.CELSIUS;
+import static org.openhab.core.thing.Thing.PROPERTY_FIRMWARE_VERSION;
+import static org.openhab.core.types.RefreshType.REFRESH;
+
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+
+import javax.measure.Unit;
+import javax.measure.quantity.Temperature;
+import javax.measure.quantity.Time;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.binding.nest.internal.data.Thermostat;
+import org.openhab.binding.nest.internal.data.Thermostat.Mode;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.QuantityType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.library.unit.SmartHomeUnits;
+import org.openhab.core.thing.ChannelUID;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.types.Command;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * The {@link NestThermostatHandler} is responsible for handling commands
+ * sent to one of the thermostat's channels.
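+ *
+ * <p>For example, a set point command arrives as a {@code QuantityType} and is routed to the Celsius or
+ * Fahrenheit field depending on the thermostat's temperature unit (a sketch; the channel UID is illustrative):
+ *
+ * <pre>{@code
+ * thermostatHandler.handleCommand(maxSetPointChannelUID, new QuantityType<>(24, SIUnits.CELSIUS));
+ * }</pre>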
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Handle channel refresh command
+ */
+@NonNullByDefault
+public class NestThermostatHandler extends NestBaseHandler<Thermostat> {
+ private final Logger logger = LoggerFactory.getLogger(NestThermostatHandler.class);
+
+ public NestThermostatHandler(Thing thing) {
+ super(thing, Thermostat.class);
+ }
+
+ @Override
+ protected State getChannelState(ChannelUID channelUID, Thermostat thermostat) {
+ switch (channelUID.getId()) {
+ case CHANNEL_CAN_COOL:
+ return getAsOnOffTypeOrNull(thermostat.isCanCool());
+ case CHANNEL_CAN_HEAT:
+ return getAsOnOffTypeOrNull(thermostat.isCanHeat());
+ case CHANNEL_ECO_MAX_SET_POINT:
+ return getAsQuantityTypeOrNull(thermostat.getEcoTemperatureHigh(), thermostat.getTemperatureUnit());
+ case CHANNEL_ECO_MIN_SET_POINT:
+ return getAsQuantityTypeOrNull(thermostat.getEcoTemperatureLow(), thermostat.getTemperatureUnit());
+ case CHANNEL_FAN_TIMER_ACTIVE:
+ return getAsOnOffTypeOrNull(thermostat.isFanTimerActive());
+ case CHANNEL_FAN_TIMER_DURATION:
+ return getAsQuantityTypeOrNull(thermostat.getFanTimerDuration(), SmartHomeUnits.MINUTE);
+ case CHANNEL_FAN_TIMER_TIMEOUT:
+ return getAsDateTimeTypeOrNull(thermostat.getFanTimerTimeout());
+ case CHANNEL_HAS_FAN:
+ return getAsOnOffTypeOrNull(thermostat.isHasFan());
+ case CHANNEL_HAS_LEAF:
+ return getAsOnOffTypeOrNull(thermostat.isHasLeaf());
+ case CHANNEL_HUMIDITY:
+ return getAsQuantityTypeOrNull(thermostat.getHumidity(), SmartHomeUnits.PERCENT);
+ case CHANNEL_LAST_CONNECTION:
+ return getAsDateTimeTypeOrNull(thermostat.getLastConnection());
+ case CHANNEL_LOCKED:
+ return getAsOnOffTypeOrNull(thermostat.isLocked());
+ case CHANNEL_LOCKED_MAX_SET_POINT:
+ return getAsQuantityTypeOrNull(thermostat.getLockedTempMax(), thermostat.getTemperatureUnit());
+ case CHANNEL_LOCKED_MIN_SET_POINT:
+ return getAsQuantityTypeOrNull(thermostat.getLockedTempMin(), thermostat.getTemperatureUnit());
+ case CHANNEL_MAX_SET_POINT:
+ return getAsQuantityTypeOrNull(thermostat.getTargetTemperatureHigh(), thermostat.getTemperatureUnit());
+ case CHANNEL_MIN_SET_POINT:
+ return getAsQuantityTypeOrNull(thermostat.getTargetTemperatureLow(), thermostat.getTemperatureUnit());
+ case CHANNEL_MODE:
+ return getAsStringTypeOrNull(thermostat.getMode());
+ case CHANNEL_PREVIOUS_MODE:
+ Mode previousMode = thermostat.getPreviousHvacMode() != null ? thermostat.getPreviousHvacMode()
+ : thermostat.getMode();
+ return getAsStringTypeOrNull(previousMode);
+ case CHANNEL_STATE:
+ return getAsStringTypeOrNull(thermostat.getHvacState());
+ case CHANNEL_SET_POINT:
+ return getAsQuantityTypeOrNull(thermostat.getTargetTemperature(), thermostat.getTemperatureUnit());
+ case CHANNEL_SUNLIGHT_CORRECTION_ACTIVE:
+ return getAsOnOffTypeOrNull(thermostat.isSunlightCorrectionActive());
+ case CHANNEL_SUNLIGHT_CORRECTION_ENABLED:
+ return getAsOnOffTypeOrNull(thermostat.isSunlightCorrectionEnabled());
+ case CHANNEL_TEMPERATURE:
+ return getAsQuantityTypeOrNull(thermostat.getAmbientTemperature(), thermostat.getTemperatureUnit());
+ case CHANNEL_TIME_TO_TARGET:
+ return getAsQuantityTypeOrNull(thermostat.getTimeToTarget(), SmartHomeUnits.MINUTE);
+ case CHANNEL_USING_EMERGENCY_HEAT:
+ return getAsOnOffTypeOrNull(thermostat.isUsingEmergencyHeat());
+ default:
+ logger.error("Unsupported channelId '{}'", channelUID.getId());
+ return UnDefType.UNDEF;
+ }
+ }
+
+ /**
+ * Handles a command for one of the thermostat channels by sending the
+ * corresponding update request to Nest.
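+ *
+ * For example, a {@code QuantityType} command on the set point channel results in a
+ * "target_temperature_c" or "target_temperature_f" update request, depending on the
+ * temperature unit reported by the thermostat.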
+ */
+ @Override
+ @SuppressWarnings("unchecked")
+ public void handleCommand(ChannelUID channelUID, Command command) {
+ if (REFRESH.equals(command)) {
+ Thermostat lastUpdate = getLastUpdate();
+ if (lastUpdate != null) {
+ updateState(channelUID, getChannelState(channelUID, lastUpdate));
+ }
+ } else if (CHANNEL_FAN_TIMER_ACTIVE.equals(channelUID.getId())) {
+ if (command instanceof OnOffType) {
+ // Update fan timer active to the command value
+ addUpdateRequest("fan_timer_active", command == OnOffType.ON);
+ }
+ } else if (CHANNEL_FAN_TIMER_DURATION.equals(channelUID.getId())) {
+ if (command instanceof QuantityType) {
+ // Update fan timer duration to the command value
+ QuantityType<Time> minuteQuantity = ((QuantityType<Time>) command).toUnit(SmartHomeUnits.MINUTE);
+ if (minuteQuantity != null) {
+ addUpdateRequest("fan_timer_duration", minuteQuantity.intValue());
+ }
+ }
+ } else if (CHANNEL_MAX_SET_POINT.equals(channelUID.getId())) {
+ if (command instanceof QuantityType) {
+ // Update maximum set point to the command value
+ addTemperatureUpdateRequest("target_temperature_high_c", "target_temperature_high_f",
+ (QuantityType<Temperature>) command);
+ }
+ } else if (CHANNEL_MIN_SET_POINT.equals(channelUID.getId())) {
+ if (command instanceof QuantityType) {
+ // Update minimum set point to the command value
+ addTemperatureUpdateRequest("target_temperature_low_c", "target_temperature_low_f",
+ (QuantityType<Temperature>) command);
+ }
+ } else if (CHANNEL_MODE.equals(channelUID.getId())) {
+ if (command instanceof StringType) {
+ // Update the HVAC mode to the command value
+ addUpdateRequest("hvac_mode", Mode.valueOf(((StringType) command).toString()));
+ }
+ } else if (CHANNEL_SET_POINT.equals(channelUID.getId())) {
+ if (command instanceof QuantityType) {
+ // Update set point to the command value
+ addTemperatureUpdateRequest("target_temperature_c", "target_temperature_f",
+ (QuantityType<Temperature>) command);
+ }
+ }
+ }
+
+ private void addUpdateRequest(String field, Object value) {
+ addUpdateRequest(NEST_THERMOSTAT_UPDATE_PATH, field, value);
+ }
+
+ private void addTemperatureUpdateRequest(String celsiusField, String fahrenheitField,
+ QuantityType<Temperature> quantity) {
+ Unit<Temperature> unit = getTemperatureUnit(quantity.getUnit());
+ BigDecimal value = quantityToRoundedTemperature(quantity, unit);
+ if (value != null) {
+ addUpdateRequest(NEST_THERMOSTAT_UPDATE_PATH, unit == CELSIUS ? celsiusField : fahrenheitField, value);
+ }
+ }
+
+ private Unit<Temperature> getTemperatureUnit(Unit<Temperature> fallbackUnit) {
+ Thermostat lastUpdate = getLastUpdate();
+ if (lastUpdate != null && lastUpdate.getTemperatureUnit() != null) {
+ return lastUpdate.getTemperatureUnit();
+ }
+
+ return fallbackUnit;
+ }
+
+ private @Nullable BigDecimal quantityToRoundedTemperature(QuantityType<Temperature> quantity,
+ Unit<Temperature> unit) {
+ QuantityType<Temperature> temperatureQuantity = quantity.toUnit(unit);
+ if (temperatureQuantity == null) {
+ return null;
+ }
+
+ BigDecimal value = temperatureQuantity.toBigDecimal();
+ BigDecimal increment = CELSIUS == unit ? new BigDecimal("0.5") : new BigDecimal("1");
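+ // Round to the nearest increment, e.g. 20.3 °C becomes 20.5 °C and 68.4 °F becomes 68 °F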
+ BigDecimal quotient = value.divide(increment, 0, RoundingMode.HALF_UP);
+ return quotient.multiply(increment);
+ }
+
+ @Override
+ protected void update(Thermostat oldThermostat, Thermostat thermostat) {
+ logger.debug("Updating {}", getThing().getUID());
+
+ updateLinkedChannels(oldThermostat, thermostat);
+ updateProperty(PROPERTY_FIRMWARE_VERSION, thermostat.getSoftwareVersion());
+
+ ThingStatus newStatus = thermostat.isOnline() == null ? ThingStatus.UNKNOWN
+ : thermostat.isOnline() ? ThingStatus.ONLINE : ThingStatus.OFFLINE;
+ if (newStatus != thing.getStatus()) {
+ updateStatus(newStatus);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.listener;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.binding.nest.internal.data.TopLevelData;
+import org.openhab.binding.nest.internal.rest.NestStreamingRestClient;
+
+/**
+ * Interface for listeners of events generated by the {@link NestStreamingRestClient}.
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Replace polling with REST streaming
+ */
+@NonNullByDefault
+public interface NestStreamingDataListener {
+
+ /**
+ * Authorization has been revoked for a token.
+ */
+ void onAuthorizationRevoked(String token);
+
+ /**
+ * The client successfully established a connection.
+ */
+ void onConnected();
+
+ /**
+ * The client was disconnected.
+ */
+ void onDisconnected();
+
+ /**
+ * An error message was published.
+ */
+ void onError(String message);
+
+ /**
+ * Initial {@link TopLevelData} or an update was received.
+ */
+ void onNewTopLevelData(TopLevelData data);
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.listener;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * Used to track incoming data for Nest things.
+ *
+ * @author Wouter Born - Initial contribution
+ */
+@NonNullByDefault
+public interface NestThingDataListener<T> {
+
+ /**
+ * An initial value for the data was received or the value was sent again due to a refresh.
+ *
+ * @param data the data
+ */
+ void onNewData(T data);
+
+ /**
+ * Existing data was updated to a new value.
+ *
+ * @param oldData the previous value
+ * @param data the current value
+ */
+ void onUpdatedData(T oldData, T data);
+
+ /**
+ * A Nest thing which previously had data is missing, e.g. because it was removed from the account.
+ *
+ * @param nestId identifies the Nest thing
+ */
+ void onMissingData(String nestId);
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.rest;
+
+import java.io.IOException;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.binding.nest.internal.NestBindingConstants;
+import org.openhab.binding.nest.internal.NestUtils;
+import org.openhab.binding.nest.internal.config.NestBridgeConfiguration;
+import org.openhab.binding.nest.internal.data.AccessTokenData;
+import org.openhab.binding.nest.internal.exceptions.InvalidAccessTokenException;
+import org.openhab.core.io.net.http.HttpUtil;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Retrieves the Nest access token using OAuth 2.0 pin-based authorization.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Improve exception handling
+ */
+@NonNullByDefault
+public class NestAuthorizer {
+ private final Logger logger = LoggerFactory.getLogger(NestAuthorizer.class);
+
+ private final NestBridgeConfiguration config;
+
+ /**
+ * Creates the helper for obtaining the Nest access token.
+ *
+ * @param config The configuration to use for the token
+ */
+ public NestAuthorizer(NestBridgeConfiguration config) {
+ this.config = config;
+ }
+
+ /**
+ * Requests a new access token using the configured pincode.
+ *
+ * @return the new access token
+ * @throws InvalidAccessTokenException thrown when no access token could be obtained, e.g. when the pincode is empty, already used or invalid
+ */
+ public String getNewAccessToken() throws InvalidAccessTokenException {
+ try {
+ String pincode = config.pincode;
+ if (pincode == null || pincode.isBlank()) {
+ throw new InvalidAccessTokenException("Pincode is empty");
+ }
+
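+ // Exchange the single use pincode for an access token (OAuth 2.0 authorization code grant)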
+ // @formatter:off
+ StringBuilder urlBuilder = new StringBuilder(NestBindingConstants.NEST_ACCESS_TOKEN_URL)
+ .append("?client_id=")
+ .append(config.productId)
+ .append("&client_secret=")
+ .append(config.productSecret)
+ .append("&code=")
+ .append(pincode)
+ .append("&grant_type=authorization_code");
+ // @formatter:on
+
+ logger.debug("Requesting access token from URL: {}", urlBuilder);
+
+ String responseContentAsString = HttpUtil.executeUrl("POST", urlBuilder.toString(), null, null,
+ "application/x-www-form-urlencoded", 10_000);
+
+ AccessTokenData data = NestUtils.fromJson(responseContentAsString, AccessTokenData.class);
+ logger.debug("Received: {}", data);
+
+ String accessToken = data.getAccessToken();
+ if (accessToken == null || accessToken.isBlank()) {
+ throw new InvalidAccessTokenException("Pincode to obtain access token is already used or invalid");
+ }
+ return accessToken;
+ } catch (IOException e) {
+ throw new InvalidAccessTokenException("Access token request failed", e);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.rest;
+
+import java.io.IOException;
+
+import javax.ws.rs.client.ClientRequestContext;
+import javax.ws.rs.client.ClientRequestFilter;
+import javax.ws.rs.core.HttpHeaders;
+import javax.ws.rs.core.MultivaluedMap;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+
+/**
+ * Inserts Authorization and Cache-Control headers for requests on the streaming REST API.
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Replace polling with REST streaming
+ */
+@NonNullByDefault
+public class NestStreamingRequestFilter implements ClientRequestFilter {
+ private final String accessToken;
+
+ public NestStreamingRequestFilter(String accessToken) {
+ this.accessToken = accessToken;
+ }
+
+ @Override
+ public void filter(@Nullable ClientRequestContext requestContext) throws IOException {
+ if (requestContext != null) {
+ MultivaluedMap<String, Object> headers = requestContext.getHeaders();
+ headers.putSingle(HttpHeaders.AUTHORIZATION, "Bearer " + accessToken);
+ headers.putSingle(HttpHeaders.CACHE_CONTROL, "no-cache");
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.rest;
+
+import static org.openhab.binding.nest.internal.NestBindingConstants.KEEP_ALIVE_MILLIS;
+
+import java.util.List;
+import java.util.concurrent.CopyOnWriteArrayList;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+
+import javax.ws.rs.client.Client;
+import javax.ws.rs.client.ClientBuilder;
+import javax.ws.rs.sse.InboundSseEvent;
+import javax.ws.rs.sse.SseEventSource;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.binding.nest.internal.NestUtils;
+import org.openhab.binding.nest.internal.data.TopLevelData;
+import org.openhab.binding.nest.internal.data.TopLevelStreamingData;
+import org.openhab.binding.nest.internal.exceptions.FailedResolvingNestUrlException;
+import org.openhab.binding.nest.internal.handler.NestRedirectUrlSupplier;
+import org.openhab.binding.nest.internal.listener.NestStreamingDataListener;
+import org.osgi.service.jaxrs.client.SseEventSourceFactory;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * A client that generates events based on Nest streaming REST API Server-Sent Events (SSE).
+ *
+ * @author Wouter Born - Initial contribution
+ * @author Wouter Born - Replace polling with REST streaming
+ */
+@NonNullByDefault
+public class NestStreamingRestClient {
+
+ // Assume the connection timed out when no message has been received for 2.5 keep-alive intervals
+ private static final long CONNECTION_TIMEOUT_MILLIS = 2 * KEEP_ALIVE_MILLIS + KEEP_ALIVE_MILLIS / 2;
+
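+ // Event names used by the Nest streaming REST API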
+ public static final String AUTH_REVOKED = "auth_revoked";
+ public static final String ERROR = "error";
+ public static final String KEEP_ALIVE = "keep-alive";
+ public static final String OPEN = "open";
+ public static final String PUT = "put";
+
+ private final Logger logger = LoggerFactory.getLogger(NestStreamingRestClient.class);
+
+ private final String accessToken;
+ private final ClientBuilder clientBuilder;
+ private final SseEventSourceFactory eventSourceFactory;
+ private final NestRedirectUrlSupplier redirectUrlSupplier;
+ private final ScheduledExecutorService scheduler;
+
+ private final Object startStopLock = new Object();
+ private final List<NestStreamingDataListener> listeners = new CopyOnWriteArrayList<>();
+
+ private @Nullable ScheduledFuture<?> checkConnectionJob;
+ private boolean connected;
+ private @Nullable SseEventSource eventSource;
+ private long lastEventTimestamp;
+ private @Nullable TopLevelData lastReceivedTopLevelData;
+
+ public NestStreamingRestClient(String accessToken, ClientBuilder clientBuilder,
+ SseEventSourceFactory eventSourceFactory, NestRedirectUrlSupplier redirectUrlSupplier,
+ ScheduledExecutorService scheduler) {
+ this.accessToken = accessToken;
+ this.clientBuilder = clientBuilder;
+ this.eventSourceFactory = eventSourceFactory;
+ this.redirectUrlSupplier = redirectUrlSupplier;
+ this.scheduler = scheduler;
+ }
+
+ private SseEventSource createEventSource() throws FailedResolvingNestUrlException {
+ Client client = clientBuilder.register(new NestStreamingRequestFilter(accessToken)).build();
+ SseEventSource eventSource = eventSourceFactory.newSource(client.target(redirectUrlSupplier.getRedirectUrl()));
+ eventSource.register(this::onEvent, this::onError);
+ return eventSource;
+ }
+
+ private void checkConnection() {
+ long millisSinceLastEvent = System.currentTimeMillis() - lastEventTimestamp;
+ if (millisSinceLastEvent > CONNECTION_TIMEOUT_MILLIS) {
+ logger.debug("Check: Disconnected from streaming events, millisSinceLastEvent={}", millisSinceLastEvent);
+ synchronized (startStopLock) {
+ stopCheckConnectionJob(false);
+ if (connected) {
+ connected = false;
+ listeners.forEach(listener -> listener.onDisconnected());
+ }
+ redirectUrlSupplier.resetCache();
+ reopenEventSource();
+ startCheckConnectionJob();
+ }
+ } else {
+ logger.debug("Check: Receiving streaming events, millisSinceLastEvent={}", millisSinceLastEvent);
+ }
+ }
+
+ /**
+ * Closes the existing EventSource and opens a new one as a workaround for when the EventSource fails to
+ * reconnect by itself.
+ */
+ private void reopenEventSource() {
+ try {
+ logger.debug("Reopening EventSource");
+ closeEventSource(10, TimeUnit.SECONDS);
+
+ logger.debug("Opening new EventSource");
+ SseEventSource localEventSource = createEventSource();
+ localEventSource.open();
+
+ eventSource = localEventSource;
+ } catch (FailedResolvingNestUrlException e) {
+ logger.debug("Failed to resolve Nest redirect URL while opening new EventSource");
+ }
+ }
+
+ public void start() {
+ synchronized (startStopLock) {
+ logger.debug("Opening EventSource and starting checkConnection job");
+ reopenEventSource();
+ startCheckConnectionJob();
+ logger.debug("Started");
+ }
+ }
+
+ public void stop() {
+ synchronized (startStopLock) {
+ logger.debug("Closing EventSource and stopping checkConnection job");
+ stopCheckConnectionJob(true);
+ closeEventSource(0, TimeUnit.SECONDS);
+ logger.debug("Stopped");
+ }
+ }
+
+ private void closeEventSource(long timeout, TimeUnit timeoutUnit) {
+ SseEventSource localEventSource = eventSource;
+ if (localEventSource != null) {
+ if (!localEventSource.isOpen()) {
+ logger.debug("Existing EventSource is already closed");
+ } else if (localEventSource.close(timeout, timeoutUnit)) {
+ logger.debug("Succesfully closed existing EventSource");
+ } else {
+ logger.debug("Failed to close existing EventSource");
+ }
+ eventSource = null;
+ }
+ }
+
+ private void startCheckConnectionJob() {
+ ScheduledFuture<?> localCheckConnectionJob = checkConnectionJob;
+ if (localCheckConnectionJob == null || localCheckConnectionJob.isCancelled()) {
+ checkConnectionJob = scheduler.scheduleWithFixedDelay(this::checkConnection, CONNECTION_TIMEOUT_MILLIS,
+ KEEP_ALIVE_MILLIS, TimeUnit.MILLISECONDS);
+ }
+ }
+
+ private void stopCheckConnectionJob(boolean mayInterruptIfRunning) {
+ ScheduledFuture<?> localCheckConnectionJob = checkConnectionJob;
+ if (localCheckConnectionJob != null && !localCheckConnectionJob.isCancelled()) {
+ localCheckConnectionJob.cancel(mayInterruptIfRunning);
+ checkConnectionJob = null;
+ }
+ }
+
+ public boolean addStreamingDataListener(NestStreamingDataListener listener) {
+ return listeners.add(listener);
+ }
+
+ public boolean removeStreamingDataListener(NestStreamingDataListener listener) {
+ return listeners.remove(listener);
+ }
+
+ public @Nullable TopLevelData getLastReceivedTopLevelData() {
+ return lastReceivedTopLevelData;
+ }
+
+ private void onEvent(InboundSseEvent inboundEvent) {
+ try {
+ lastEventTimestamp = System.currentTimeMillis();
+
+ String name = inboundEvent.getName();
+ String data = inboundEvent.readData();
+
+ logger.debug("Received '{}' event, data: {}", name, data);
+
+ if (!connected) {
+ logger.debug("Connected to streaming events");
+ connected = true;
+ listeners.forEach(listener -> listener.onConnected());
+ }
+
+ if (AUTH_REVOKED.equals(name)) {
+ logger.debug("API authorization has been revoked for access token: {}", data);
+ listeners.forEach(listener -> listener.onAuthorizationRevoked(data));
+ } else if (ERROR.equals(name)) {
+ logger.warn("Error occurred: {}", data);
+ listeners.forEach(listener -> listener.onError(data));
+ } else if (KEEP_ALIVE.equals(name)) {
+ logger.debug("Received message to keep connection alive");
+ } else if (OPEN.equals(name)) {
+ logger.debug("Event stream opened");
+ } else if (PUT.equals(name)) {
+ logger.debug("Data has changed (or initial data sent)");
+ TopLevelData topLevelData = NestUtils.fromJson(data, TopLevelStreamingData.class).getData();
+ lastReceivedTopLevelData = topLevelData;
+ listeners.forEach(listener -> listener.onNewTopLevelData(topLevelData));
+ } else {
+ logger.debug("Received unhandled event with name '{}' and data '{}'", name, data);
+ }
+ } catch (Exception e) {
+ // Catch all exceptions here, otherwise they will be silently swallowed by the SSE implementation
+ logger.warn("An exception occurred while processing the inbound event", e);
+ }
+ }
+
+ private void onError(Throwable error) {
+ logger.debug("Error occurred while receiving events", error);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.rest;
+
+import java.util.HashMap;
+import java.util.Map;
+
+/**
+ * Contains the data needed to make an update request back to Nest.
+ *
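+ * A minimal usage sketch (the path constant and field value shown are illustrative only):
+ *
+ * <pre>
+ * NestUpdateRequest request = new NestUpdateRequest.Builder()
+ *         .withBasePath(NestBindingConstants.NEST_THERMOSTAT_UPDATE_PATH)
+ *         .withIdentifier("device-id")
+ *         .withAdditionalValue("hvac_mode", "heat")
+ *         .build();
+ * </pre>
+ *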
+ * @author David Bennett - Initial contribution
+ */
+public class NestUpdateRequest {
+ private final String updatePath;
+ private final Map<String, Object> values;
+
+ private NestUpdateRequest(Builder builder) {
+ this.updatePath = builder.basePath + builder.identifier;
+ this.values = builder.values;
+ }
+
+ public String getUpdatePath() {
+ return updatePath;
+ }
+
+ public Map<String, Object> getValues() {
+ return values;
+ }
+
+ public static class Builder {
+ private String basePath;
+ private String identifier;
+ private Map<String, Object> values = new HashMap<>();
+
+ public Builder withBasePath(String basePath) {
+ this.basePath = basePath;
+ return this;
+ }
+
+ public Builder withIdentifier(String identifier) {
+ this.identifier = identifier;
+ return this;
+ }
+
+ public Builder withAdditionalValue(String field, Object value) {
+ values.put(field, value);
+ return this;
+ }
+
+ public NestUpdateRequest build() {
+ return new NestUpdateRequest(this);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.update;
+
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.function.Supplier;
+import java.util.stream.Collectors;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.binding.nest.internal.data.NestIdentifiable;
+import org.openhab.binding.nest.internal.data.TopLevelData;
+import org.openhab.binding.nest.internal.listener.NestThingDataListener;
+
+/**
+ * Handles all Nest data updates through delegation to the {@link NestUpdateHandler} for the respective data type.
+ *
+ * @author Wouter Born - Initial contribution
+ */
+@NonNullByDefault
+public class NestCompositeUpdateHandler {
+
+ private final Supplier<Set<String>> presentNestIdsSupplier;
+ private final Map<Class<?>, @Nullable NestUpdateHandler<?>> updateHandlersMap = new ConcurrentHashMap<>();
+
+ public NestCompositeUpdateHandler(Supplier<Set<String>> presentNestIdsSupplier) {
+ this.presentNestIdsSupplier = presentNestIdsSupplier;
+ }
+
+ public <T> boolean addListener(Class<T> dataClass, NestThingDataListener<T> listener) {
+ return getOrCreateUpdateHandler(dataClass).addListener(listener);
+ }
+
+ public <T> boolean addListener(Class<T> dataClass, String nestId, NestThingDataListener<T> listener) {
+ return getOrCreateUpdateHandler(dataClass).addListener(nestId, listener);
+ }
+
+ private Set<String> findMissingNestIds(Set<NestIdentifiable> updates) {
+ Set<String> nestIds = updates.stream().map(u -> u.getId()).collect(Collectors.toSet());
+ Set<String> missingNestIds = presentNestIdsSupplier.get();
+ missingNestIds.removeAll(nestIds);
+ return missingNestIds;
+ }
+
+ public @Nullable <T> T getLastUpdate(Class<T> dataClass, String nestId) {
+ return getOrCreateUpdateHandler(dataClass).getLastUpdate(nestId);
+ }
+
+ public <T> List<T> getLastUpdates(Class<T> dataClass) {
+ return getOrCreateUpdateHandler(dataClass).getLastUpdates();
+ }
+
+ private Set<NestIdentifiable> getNestUpdates(TopLevelData data) {
+ Set<NestIdentifiable> updates = new HashSet<>();
+ if (data.getDevices() != null) {
+ if (data.getDevices().getCameras() != null) {
+ updates.addAll(data.getDevices().getCameras().values());
+ }
+ if (data.getDevices().getSmokeCoAlarms() != null) {
+ updates.addAll(data.getDevices().getSmokeCoAlarms().values());
+ }
+ if (data.getDevices().getThermostats() != null) {
+ updates.addAll(data.getDevices().getThermostats().values());
+ }
+ }
+ if (data.getStructures() != null) {
+ updates.addAll(data.getStructures().values());
+ }
+ return updates;
+ }
+
+ @SuppressWarnings("unchecked")
+ private <T> NestUpdateHandler<T> getOrCreateUpdateHandler(Class<T> dataClass) {
+ return (NestUpdateHandler<T>) updateHandlersMap.computeIfAbsent(dataClass, clazz -> new NestUpdateHandler<>());
+ }
+
+ @SuppressWarnings("unchecked")
+ public void handleUpdate(TopLevelData data) {
+ Set<NestIdentifiable> updates = getNestUpdates(data);
+ updates.forEach(update -> {
+ Class<NestIdentifiable> updateClass = (Class<NestIdentifiable>) update.getClass();
+ getOrCreateUpdateHandler(updateClass).handleUpdate(updateClass, update.getId(), update);
+ });
+
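+ // Nest IDs that no longer occur in the update have been removed from the account,
+ // so notify all handlers about the missing IDs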
+ Set<String> missingNestIds = findMissingNestIds(updates);
+ if (!missingNestIds.isEmpty()) {
+ updateHandlersMap.values().forEach(handler -> {
+ if (handler != null) {
+ handler.handleMissingNestIds(missingNestIds);
+ }
+ });
+ }
+ }
+
+ public <T> boolean removeListener(Class<T> dataClass, NestThingDataListener<T> listener) {
+ return getOrCreateUpdateHandler(dataClass).removeListener(listener);
+ }
+
+ public <T> boolean removeListener(Class<T> dataClass, String nestId, NestThingDataListener<T> listener) {
+ return getOrCreateUpdateHandler(dataClass).removeListener(nestId, listener);
+ }
+
+ public void resendLastUpdates() {
+ updateHandlersMap.values().forEach(handler -> {
+ if (handler != null) {
+ handler.resendLastUpdates();
+ }
+ });
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.update;
+
+import java.util.ArrayList;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.concurrent.CopyOnWriteArraySet;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.binding.nest.internal.listener.NestThingDataListener;
+
+/**
+ * Handles the updates of one type of data by notifying listeners of changes and storing the update value.
+ *
+ * @author Wouter Born - Initial contribution
+ *
+ * @param <T> the type of update data
+ */
+@NonNullByDefault
+public class NestUpdateHandler<T> {
+
+ /**
+ * The ID used for listeners that subscribe to any Nest update.
+ */
+ private static final String ANY_ID = "*";
+
+ private final Map<String, @Nullable T> lastUpdates = new ConcurrentHashMap<>();
+ private final Map<String, @Nullable Set<NestThingDataListener<T>>> listenersMap = new ConcurrentHashMap<>();
+
+ public boolean addListener(NestThingDataListener<T> listener) {
+ return addListener(ANY_ID, listener);
+ }
+
+ public boolean addListener(String nestId, NestThingDataListener<T> listener) {
+ return getOrCreateListeners(nestId).add(listener);
+ }
+
+ public @Nullable T getLastUpdate(String nestId) {
+ return lastUpdates.get(nestId);
+ }
+
+ public List<T> getLastUpdates() {
+ return new ArrayList<>(lastUpdates.values());
+ }
+
+ private Set<NestThingDataListener<T>> getListeners(String nestId) {
+ Set<NestThingDataListener<T>> result = new HashSet<>();
+ Set<NestThingDataListener<T>> idListeners = listenersMap.get(nestId);
+ if (idListeners != null) {
+ result.addAll(idListeners);
+ }
+ Set<NestThingDataListener<T>> anyListeners = listenersMap.get(ANY_ID);
+ if (anyListeners != null) {
+ result.addAll(anyListeners);
+ }
+ return result;
+ }
+
+ private Set<NestThingDataListener<T>> getOrCreateListeners(String nestId) {
+ return listenersMap.computeIfAbsent(nestId, id -> new CopyOnWriteArraySet<>());
+ }
+
+ public void handleMissingNestIds(Set<String> nestIds) {
+ nestIds.forEach(nestId -> {
+ lastUpdates.remove(nestId);
+ getListeners(nestId).forEach(l -> l.onMissingData(nestId));
+ });
+ }
+
+ public void handleUpdate(Class<T> dataClass, String nestId, T update) {
+ T lastUpdate = getLastUpdate(nestId);
+ lastUpdates.put(nestId, update);
+ notifyListeners(nestId, lastUpdate, update);
+ }
+
+ private void notifyListeners(String nestId, @Nullable T lastUpdate, T update) {
+ Set<NestThingDataListener<T>> listeners = getListeners(nestId);
+ if (lastUpdate == null) {
+ listeners.forEach(l -> l.onNewData(update));
+ } else if (!lastUpdate.equals(update)) {
+ listeners.forEach(l -> l.onUpdatedData(lastUpdate, update));
+ }
+ }
+
+ public boolean removeListener(NestThingDataListener<T> listener) {
+ return removeListener(ANY_ID, listener);
+ }
+
+ public boolean removeListener(String nestId, NestThingDataListener<T> listener) {
+ return getOrCreateListeners(nestId).remove(listener);
+ }
+
+ public void resendLastUpdates() {
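+ // Pass null as the previous value so listeners are notified again via onNewData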
+ lastUpdates.forEach((nestId, update) -> notifyListeners(nestId, null, update));
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<binding:binding id="nest" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:binding="https://openhab.org/schemas/binding/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/binding/v1.0.0 https://openhab.org/schemas/binding-1.0.0.xsd">
+
+ <name>Nest Binding</name>
+ <description>The Nest binding connects to the Nest cloud and allows control of the various Nest devices.</description>
+
+</binding:binding>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<config-description:config-descriptions
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:config-description="https://openhab.org/schemas/config-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/config-description/v1.0.0 https://openhab.org/schemas/config-description-1.0.0.xsd">
+
+ <config-description uri="thing-type:nest:account">
+ <parameter-group name="oauth">
+ <label>Nest API OAuth</label>
+ <description>The OAuth parameters used when communicating with the Nest API</description>
+ </parameter-group>
+ <parameter-group name="binding">
+ <label>Binding Settings</label>
+ <description>Local settings</description>
+ </parameter-group>
+
+ <parameter name="productId" type="text" groupName="oauth">
+ <label>Product ID</label>
+ <description>The product ID from the Nest product page</description>
+ <required>true</required>
+ </parameter>
+ <parameter name="productSecret" type="text" groupName="oauth">
+ <label>Product Secret</label>
+ <description>The product secret from the Nest product page</description>
+ <required>true</required>
+ </parameter>
+ <parameter name="pincode" type="text" groupName="oauth">
+ <label>Pincode</label>
+ <description>The single use pincode for obtaining an OAuth access token.
+ Get the pincode by accepting the terms shown at the product authorization URL.
+ This value is automatically reset when the access token has been obtained</description>
+ </parameter>
+ <parameter name="accessToken" type="text" groupName="oauth">
+ <label>Access Token</label>
+ <description>The access token used for authenticating to the Nest API.
+ It is automatically obtained from Nest when the value is empty and a valid pincode parameter is entered</description>
+ <advanced>true</advanced>
+ </parameter>
+ </config-description>
+
+ <config-description uri="thing-type:nest:device">
+ <parameter name="deviceId" type="text" required="true">
+ <label>Device ID</label>
+ </parameter>
+ </config-description>
+
+ <config-description uri="thing-type:nest:structure">
+ <parameter name="structureId" type="text" required="true">
+ <label>Structure ID</label>
+ </parameter>
+ </config-description>
+
+</config-description:config-descriptions>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<thing:thing-descriptions bindingId="nest"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:thing="https://openhab.org/schemas/thing-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/thing-description/v1.0.0 https://openhab.org/schemas/thing-description-1.0.0.xsd">
+
+ <bridge-type id="account">
+ <label>Nest Account</label>
+ <description>An account for using the Nest REST API</description>
+ <config-description-ref uri="thing-type:nest:account"/>
+ </bridge-type>
+</thing:thing-descriptions>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<thing:thing-descriptions bindingId="nest"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:thing="https://openhab.org/schemas/thing-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/thing-description/v1.0.0 https://openhab.org/schemas/thing-description-1.0.0.xsd">
+
+ <thing-type id="camera">
+ <supported-bridge-type-refs>
+ <bridge-type-ref id="account"/>
+ </supported-bridge-type-refs>
+
+ <label>Nest Cam</label>
+ <description>A Nest Cam registered with your account</description>
+
+ <channel-groups>
+ <channel-group id="camera" typeId="Camera"/>
+ <channel-group id="last_event" typeId="CameraEvent">
+ <label>Last Event</label>
+ <description>Information about the last camera event (requires Nest Aware subscription)</description>
+ </channel-group>
+ </channel-groups>
+
+ <properties>
+ <property name="vendor">Nest</property>
+ </properties>
+
+ <representation-property>deviceId</representation-property>
+
+ <config-description-ref uri="thing-type:nest:device"/>
+ </thing-type>
+</thing:thing-descriptions>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<thing:thing-descriptions bindingId="nest"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:thing="https://openhab.org/schemas/thing-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/thing-description/v1.0.0 https://openhab.org/schemas/thing-description-1.0.0.xsd">
+
+ <!-- Common -->
+ <channel-type id="LastConnection" advanced="true">
+ <item-type>DateTime</item-type>
+ <label>Last Connection</label>
+ <description>Timestamp of the last successful interaction with Nest</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <!-- Structure -->
+ <channel-type id="Away">
+ <item-type>String</item-type>
+ <label>Away</label>
+ <description>Away state of the structure</description>
+ <state>
+ <options>
+ <option value="AWAY">Away</option>
+ <option value="HOME">Home</option>
+ </options>
+ </state>
+ </channel-type>
+
+ <channel-type id="CountryCode" advanced="true">
+ <item-type>String</item-type>
+ <label>Country Code</label>
+ <description>Country code of the structure</description>
+ </channel-type>
+
+ <channel-type id="PostalCode" advanced="true">
+ <item-type>String</item-type>
+ <label>Postal Code</label>
+ <description>Postal code of the structure</description>
+ </channel-type>
+
+ <channel-type id="TimeZone">
+ <item-type>String</item-type>
+ <label>Time Zone</label>
+ <description>The time zone for the structure</description>
+ </channel-type>
+
+ <channel-type id="PeakPeriodStartTime" advanced="true">
+ <item-type>DateTime</item-type>
+ <label>Peak Period Start Time</label>
+ <description>Peak period start for the Rush Hour Rewards program</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="PeakPeriodEndTime" advanced="true">
+ <item-type>DateTime</item-type>
+ <label>Peak Period End Time</label>
+ <description>Peak period end for the Rush Hour Rewards program</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="EtaBegin" advanced="true">
+ <item-type>DateTime</item-type>
+ <label>ETA</label>
+ <description>
+ Estimated time of arrival at home; used to turn the heat on so the home is warm by the time you arrive
+ </description>
+ </channel-type>
+
+ <channel-type id="RushHourRewardsEnrollment">
+ <item-type>Switch</item-type>
+ <label>Rush Hour Rewards</label>
+ <description>Whether the Rush Hour Rewards program is enabled</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="SecurityState">
+ <item-type>String</item-type>
+ <label>Security State</label>
+ <description>Security state of the structure</description>
+ <state readOnly="true">
+ <options>
+ <option value="OK">ok</option>
+ <option value="DETER">deter</option>
+ </options>
+ </state>
+ </channel-type>
+
+ <!-- Camera -->
+ <channel-group-type id="Camera">
+ <label>Camera</label>
+ <description>Information about the camera</description>
+ <channels>
+ <channel id="streaming" typeId="Streaming"/>
+ <channel id="audio_input_enabled" typeId="AudioInputEnabled"/>
+ <channel id="public_share_enabled" typeId="PublicShareEnabled"/>
+ <channel id="video_history_enabled" typeId="VideoHistoryEnabled"/>
+ <channel id="app_url" typeId="AppUrl"/>
+ <channel id="snapshot_url" typeId="SnapshotUrl"/>
+ <channel id="public_share_url" typeId="PublicShareUrl"/>
+ <channel id="web_url" typeId="WebUrl"/>
+ <channel id="last_online_change" typeId="LastOnlineChange"/>
+ </channels>
+ </channel-group-type>
+
+ <channel-type id="AudioInputEnabled" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Audio Input Enabled</label>
+ <description>If the audio input is enabled for this camera</description>
+ </channel-type>
+
+ <channel-type id="VideoHistoryEnabled" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Video History Enabled</label>
+ <description>If the video history is enabled for this camera</description>
+ </channel-type>
+
+ <channel-type id="PublicShareEnabled" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Public Share Enabled</label>
+ <description>If the public sharing of this camera is enabled</description>
+ </channel-type>
+
+ <channel-type id="Streaming">
+ <item-type>Switch</item-type>
+ <label>Streaming</label>
+ <description>If the camera is currently streaming</description>
+ </channel-type>
+
+ <channel-type id="WebUrl">
+ <item-type>String</item-type>
+ <label>Web URL</label>
+ <description>The web URL for the camera, allows you to see the camera in a web page</description>
+ </channel-type>
+
+ <channel-type id="PublicShareUrl">
+ <item-type>String</item-type>
+ <label>Public Share URL</label>
+ <description>The publicly available URL for the camera</description>
+ </channel-type>
+
+ <channel-type id="SnapshotUrl" advanced="true">
+ <item-type>String</item-type>
+ <label>Snapshot URL</label>
+ <description>The URL showing a snapshot of the camera</description>
+ </channel-type>
+
+ <channel-type id="AppUrl" advanced="true">
+ <item-type>String</item-type>
+ <label>App URL</label>
+ <description>The app URL for the camera, allows you to see the camera in an app</description>
+ </channel-type>
+
+ <channel-type id="LastOnlineChange" advanced="true">
+ <item-type>DateTime</item-type>
+ <label>Last Online Change</label>
+ <description>Timestamp of the last online status change</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-group-type id="CameraEvent">
+ <label>Camera Event</label>
+ <description>Information about the camera event</description>
+ <channels>
+ <channel id="has_motion" typeId="CameraEventHasMotion"/>
+ <channel id="has_sound" typeId="CameraEventHasSound"/>
+ <channel id="has_person" typeId="CameraEventHasPerson"/>
+ <channel id="start_time" typeId="CameraEventStartTime"/>
+ <channel id="end_time" typeId="CameraEventEndTime"/>
+ <channel id="urls_expire_time" typeId="CameraEventUrlsExpireTime"/>
+ <channel id="animated_image_url" typeId="CameraEventAnimatedImageUrl"/>
+ <channel id="app_url" typeId="CameraEventAppUrl"/>
+ <channel id="image_url" typeId="CameraEventImageUrl"/>
+ <channel id="web_url" typeId="CameraEventWebUrl"/>
+ <channel id="activity_zones" typeId="CameraEventActivityZones"/>
+ </channels>
+ </channel-group-type>
+
+ <channel-type id="CameraEventHasSound" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Has Sound</label>
+ <description>If sound was detected in the camera event</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CameraEventHasMotion" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Has Motion</label>
+ <description>If motion was detected in the camera event</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CameraEventHasPerson" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Has Person</label>
+ <description>If a person was detected in the camera event</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CameraEventStartTime" advanced="true">
+ <item-type>DateTime</item-type>
+ <label>Start Time</label>
+ <description>Timestamp when the camera event started</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CameraEventEndTime" advanced="true">
+ <item-type>DateTime</item-type>
+ <label>End Time</label>
+ <description>Timestamp when the camera event ended</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CameraEventUrlsExpireTime" advanced="true">
+ <item-type>DateTime</item-type>
+ <label>URLs Expire Time</label>
+ <description>Timestamp when the camera event URLs expire</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CameraEventWebUrl" advanced="true">
+ <item-type>String</item-type>
+ <label>Web URL</label>
+ <description>The web URL for the camera event, allows you to see the camera event in a web page</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CameraEventAppUrl" advanced="true">
+ <item-type>String</item-type>
+ <label>App URL</label>
+ <description>The app URL for the camera event, allows you to see the camera event in an app</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CameraEventImageUrl" advanced="true">
+ <item-type>String</item-type>
+ <label>Image URL</label>
+ <description>The URL showing an image for the camera event</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CameraEventAnimatedImageUrl" advanced="true">
+ <item-type>String</item-type>
+ <label>Animated Image URL</label>
+ <description>The URL showing an animated image for the camera event</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CameraEventActivityZones" advanced="true">
+ <item-type>String</item-type>
+ <label>Activity Zones</label>
+ <description>Identifiers for activity zones that detected the event (comma separated)</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <!-- Smoke detector -->
+ <channel-type id="UiColorState" advanced="true">
+ <item-type>String</item-type>
+ <label>UI Color State</label>
+ <description>Current color state of the Nest Protect</description>
+ <state readOnly="true">
+ <options>
+ <option value="GRAY">gray</option>
+ <option value="GREEN">green</option>
+ <option value="YELLOW">yellow</option>
+ <option value="RED">red</option>
+ </options>
+ </state>
+ </channel-type>
+
+ <channel-type id="CoAlarmState">
+ <item-type>String</item-type>
+ <label>CO Alarm State</label>
+ <description>Carbon monoxide alarm state</description>
+ <state readOnly="true">
+ <options>
+ <option value="OK">ok</option>
+ <option value="EMERGENCY">emergency</option>
+ <option value="WARNING">warning</option>
+ </options>
+ </state>
+ </channel-type>
+
+ <channel-type id="SmokeAlarmState">
+ <item-type>String</item-type>
+ <label>Smoke Alarm State</label>
+ <description>Smoke alarm state</description>
+ <state readOnly="true">
+ <options>
+ <option value="OK">ok</option>
+ <option value="EMERGENCY">emergency</option>
+ <option value="WARNING">warning</option>
+ </options>
+ </state>
+ </channel-type>
+
+ <channel-type id="ManualTestActive" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Manual Test Active</label>
+ <description>If the manual test is currently active</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="LastManualTestTime" advanced="true">
+ <item-type>DateTime</item-type>
+ <label>Last Manual Test Time</label>
+ <description>Timestamp of the last successful manual test</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <!-- Thermostat -->
+ <channel-type id="Temperature">
+ <item-type>Number:Temperature</item-type>
+ <label>Temperature</label>
+ <description>Current temperature</description>
+ <category>Temperature</category>
+ <state readOnly="true" pattern="%.1f %unit%"/>
+ </channel-type>
+
+ <channel-type id="SetPoint">
+ <item-type>Number:Temperature</item-type>
+ <label>Set Point</label>
+ <description>The set point temperature</description>
+ <category>Temperature</category>
+ <state pattern="%.1f %unit%" step="0.5"/>
+ </channel-type>
+
+ <channel-type id="MaxSetPoint">
+ <item-type>Number:Temperature</item-type>
+ <label>Max Set Point</label>
+ <description>The max set point temperature</description>
+ <category>Temperature</category>
+ <state pattern="%.1f %unit%" step="0.5"/>
+ </channel-type>
+
+ <channel-type id="MinSetPoint">
+ <item-type>Number:Temperature</item-type>
+ <label>Min Set Point</label>
+ <description>The min set point temperature</description>
+ <category>Temperature</category>
+ <state pattern="%.1f %unit%" step="0.5"/>
+ </channel-type>
+
+ <channel-type id="EcoMaxSetPoint" advanced="true">
+ <item-type>Number:Temperature</item-type>
+ <label>Eco Max Set Point</label>
+ <description>The eco range max set point temperature</description>
+ <category>Temperature</category>
+ <state readOnly="true" pattern="%.1f %unit%"/>
+ </channel-type>
+
+ <channel-type id="EcoMinSetPoint" advanced="true">
+ <item-type>Number:Temperature</item-type>
+ <label>Eco Min Set Point</label>
+ <description>The eco range min set point temperature</description>
+ <category>Temperature</category>
+ <state readOnly="true" pattern="%.1f %unit%"/>
+ </channel-type>
+
+ <channel-type id="LockedMaxSetPoint" advanced="true">
+ <item-type>Number:Temperature</item-type>
+ <label>Locked Max Set Point</label>
+ <description>The locked range max set point temperature</description>
+ <category>Temperature</category>
+ <state readOnly="true" pattern="%.1f %unit%"/>
+ </channel-type>
+
+ <channel-type id="LockedMinSetPoint" advanced="true">
+ <item-type>Number:Temperature</item-type>
+ <label>Locked Min Set Point</label>
+ <description>The locked range min set point temperature</description>
+ <category>Temperature</category>
+ <state readOnly="true" pattern="%.1f %unit%"/>
+ </channel-type>
+
+ <channel-type id="Locked" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Locked</label>
+ <description>If the thermostat has the temperature locked to only be within a set range</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="Mode">
+ <item-type>String</item-type>
+ <label>Mode</label>
+ <description>Current mode of the Nest thermostat</description>
+ <state>
+ <options>
+ <option value="OFF">off</option>
+ <option value="ECO">eco</option>
+ <option value="HEAT">heating</option>
+ <option value="COOL">cooling</option>
+ <option value="HEAT_COOL">heat/cool</option>
+ </options>
+ </state>
+ </channel-type>
+
+ <channel-type id="PreviousMode" advanced="true">
+ <item-type>String</item-type>
+ <label>Previous Mode</label>
+ <description>The previous mode of the Nest thermostat</description>
+ <state readOnly="true">
+ <options>
+ <option value="OFF">off</option>
+ <option value="ECO">eco</option>
+ <option value="HEAT">heating</option>
+ <option value="COOL">cooling</option>
+ <option value="HEAT_COOL">heat/cool</option>
+ </options>
+ </state>
+ </channel-type>
+
+ <channel-type id="State" advanced="true">
+ <item-type>String</item-type>
+ <label>State</label>
+ <description>The active state of the Nest thermostat</description>
+ <state readOnly="true">
+ <options>
+ <option value="OFF">off</option>
+ <option value="HEATING">heating</option>
+ <option value="COOLING">cooling</option>
+ </options>
+ </state>
+ </channel-type>
+
+ <channel-type id="Humidity">
+ <item-type>Number:Dimensionless</item-type>
+ <label>Humidity</label>
+ <description>Indicates the current relative humidity</description>
+ <category>Humidity</category>
+ <state pattern="%.1f %unit%" readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="TimeToTarget">
+ <item-type>Number:Time</item-type>
+ <label>Time to Target</label>
+ <description>Approximate time remaining until the target temperature is reached</description>
+ <state pattern="%d %unit%" readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CanHeat" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Can Heat</label>
+ <description>If the thermostat can actually turn on heating</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="CanCool" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Can Cool</label>
+ <description>If the thermostat can actually turn on cooling</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="FanTimerActive" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Fan Timer Active</label>
+ <description>If the fan timer is engaged</description>
+ <state/>
+ </channel-type>
+
+ <channel-type id="FanTimerDuration" advanced="true">
+ <item-type>Number:Time</item-type>
+ <label>Fan Timer Duration</label>
+ <description>Length of time that the fan is set to run</description>
+ <state>
+ <options>
+ <option value="15">15 min</option>
+ <option value="30">30 min</option>
+ <option value="45">45 min</option>
+ <option value="60">1 h</option>
+ <option value="120">2 h</option>
+ <option value="240">4 h</option>
+ <option value="480">8 h</option>
+ <option value="960">16 h</option>
+ </options>
+ </state>
+ </channel-type>
+
+ <channel-type id="FanTimerTimeout" advanced="true">
+ <item-type>DateTime</item-type>
+ <label>Fan Timer Timeout</label>
+ <description>Timestamp when the fan stops running</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="HasFan" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Has Fan</label>
+ <description>If the thermostat can control the fan</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="HasLeaf" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Has Leaf</label>
+ <description>If the thermostat is currently in a leaf mode</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="SunlightCorrectionEnabled" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Sunlight Correction Enabled</label>
+ <description>If sunlight correction is enabled</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="SunlightCorrectionActive" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Sunlight Correction Active</label>
+ <description>If sunlight correction is active</description>
+ <state readOnly="true"/>
+ </channel-type>
+
+ <channel-type id="UsingEmergencyHeat" advanced="true">
+ <item-type>Switch</item-type>
+ <label>Using Emergency Heat</label>
+ <description>If the system is currently using emergency heat</description>
+ <state readOnly="true"/>
+ </channel-type>
+</thing:thing-descriptions>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<thing:thing-descriptions bindingId="nest"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:thing="https://openhab.org/schemas/thing-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/thing-description/v1.0.0 https://openhab.org/schemas/thing-description-1.0.0.xsd">
+
+ <thing-type id="smoke_detector">
+ <supported-bridge-type-refs>
+ <bridge-type-ref id="account"/>
+ </supported-bridge-type-refs>
+
+ <label>Nest Protect</label>
+ <description>A Nest Protect smoke/CO detector registered with your account</description>
+
+ <channels>
+ <channel id="ui_color_state" typeId="UiColorState"/>
+ <channel id="low_battery" typeId="system.low-battery"/>
+ <channel id="co_alarm_state" typeId="CoAlarmState"/>
+ <channel id="smoke_alarm_state" typeId="SmokeAlarmState"/>
+ <channel id="manual_test_active" typeId="ManualTestActive"/>
+ <channel id="last_manual_test_time" typeId="LastManualTestTime"/>
+ <channel id="last_connection" typeId="LastConnection"/>
+ </channels>
+
+ <properties>
+ <property name="vendor">Nest</property>
+ </properties>
+
+ <representation-property>deviceId</representation-property>
+
+ <config-description-ref uri="thing-type:nest:device"/>
+ </thing-type>
+</thing:thing-descriptions>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<thing:thing-descriptions bindingId="nest"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:thing="https://openhab.org/schemas/thing-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/thing-description/v1.0.0 https://openhab.org/schemas/thing-description-1.0.0.xsd">
+
+ <thing-type id="structure">
+ <supported-bridge-type-refs>
+ <bridge-type-ref id="account"/>
+ </supported-bridge-type-refs>
+
+ <label>Nest Structure</label>
+ <description>The Nest structure defines the house the account has set up on Nest.
+ You will only have more than one structure if you have more than one house</description>
+
+ <channels>
+ <channel id="country_code" typeId="CountryCode"/>
+ <channel id="postal_code" typeId="PostalCode"/>
+ <channel id="time_zone" typeId="TimeZone"/>
+ <channel id="peak_period_start_time" typeId="PeakPeriodStartTime"/>
+ <channel id="peak_period_end_time" typeId="PeakPeriodEndTime"/>
+ <channel id="rush_hour_rewards_enrollment" typeId="RushHourRewardsEnrollment"/>
+ <channel id="eta_begin" typeId="EtaBegin"/>
+ <channel id="co_alarm_state" typeId="CoAlarmState"/>
+ <channel id="smoke_alarm_state" typeId="SmokeAlarmState"/>
+ <channel id="security_state" typeId="SecurityState"/>
+ <channel id="away" typeId="Away"/>
+ </channels>
+
+ <properties>
+ <property name="vendor">Nest</property>
+ </properties>
+
+ <representation-property>structureId</representation-property>
+
+ <config-description-ref uri="thing-type:nest:structure"/>
+ </thing-type>
+
+</thing:thing-descriptions>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<thing:thing-descriptions bindingId="nest"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:thing="https://openhab.org/schemas/thing-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/thing-description/v1.0.0 https://openhab.org/schemas/thing-description-1.0.0.xsd">
+
+ <thing-type id="thermostat">
+ <supported-bridge-type-refs>
+ <bridge-type-ref id="account"/>
+ </supported-bridge-type-refs>
+
+ <label>Nest Thermostat</label>
+ <description>A Thermostat to control the various aspects of the house's HVAC system</description>
+
+ <channels>
+ <channel id="temperature" typeId="Temperature"/>
+ <channel id="humidity" typeId="Humidity"/>
+ <channel id="mode" typeId="Mode"/>
+ <channel id="previous_mode" typeId="PreviousMode"/>
+ <channel id="state" typeId="State"/>
+ <channel id="set_point" typeId="SetPoint"/>
+ <channel id="max_set_point" typeId="MaxSetPoint"/>
+ <channel id="min_set_point" typeId="MinSetPoint"/>
+ <channel id="can_heat" typeId="CanHeat"/>
+ <channel id="can_cool" typeId="CanCool"/>
+ <channel id="fan_timer_active" typeId="FanTimerActive"/>
+ <channel id="fan_timer_duration" typeId="FanTimerDuration"/>
+ <channel id="fan_timer_timeout" typeId="FanTimerTimeout"/>
+ <channel id="has_fan" typeId="HasFan"/>
+ <channel id="has_leaf" typeId="HasLeaf"/>
+ <channel id="sunlight_correction_enabled" typeId="SunlightCorrectionEnabled"/>
+ <channel id="sunlight_correction_active" typeId="SunlightCorrectionActive"/>
+ <channel id="using_emergency_heat" typeId="UsingEmergencyHeat"/>
+ <channel id="eco_max_set_point" typeId="EcoMaxSetPoint"/>
+ <channel id="eco_min_set_point" typeId="EcoMinSetPoint"/>
+ <channel id="locked" typeId="Locked"/>
+ <channel id="locked_max_set_point" typeId="LockedMaxSetPoint"/>
+ <channel id="locked_min_set_point" typeId="LockedMinSetPoint"/>
+ <channel id="time_to_target" typeId="TimeToTarget"/>
+ <channel id="last_connection" typeId="LastConnection"/>
+ </channels>
+
+ <properties>
+ <property name="vendor">Nest</property>
+ </properties>
+
+ <representation-property>deviceId</representation-property>
+
+ <config-description-ref uri="thing-type:nest:device"/>
+ </thing-type>
+</thing:thing-descriptions>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="test" value="true"/>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.persistence.dynamodb</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+# Amazon DynamoDB Persistence
+
+This service allows you to persist state updates using the [Amazon DynamoDB](https://aws.amazon.com/dynamodb/) database.
+Query functionality is also fully supported.
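+
+Once persisted, item states can be queried from rules just like with any other persistence service.
+A minimal Rules DSL sketch (the item name `LivingRoom_Temperature` and the rule itself are illustrative):
+
+```java
+rule "Log daily temperature average"
+when
+    Time cron "0 0 0 * * ?"
+then
+    // Query DynamoDB explicitly by passing the service id "dynamodb"
+    val avg = LivingRoom_Temperature.averageSince(now.minusDays(1), "dynamodb")
+    logInfo("dynamodb", "24h average: " + avg)
+end
+```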
+
+Features:
+
+* Writing and reading item states to and from Amazon DynamoDB, a NoSQL database service
+* Configurable database table names
+* Automatic table creation
+
+## Disclaimer
+
+This service is provided "AS IS", and the user takes full responsibility for any charges or damage to data stored in Amazon AWS.
+
+## Table of Contents
+
+<!-- Using MarkdownTOC plugin for Sublime Text to update the table of contents (TOC) -->
+<!-- MarkdownTOC depth=3 autolink=true bracket=round -->
+
+- [Prerequisites](#prerequisites)
+ - [Setting Up an Amazon Account](#setting-up-an-amazon-account)
+- [Configuration](#configuration)
+ - [Basic configuration](#basic-configuration)
+ - [Configuration Using Credentials File](#configuration-using-credentials-file)
+ - [Advanced Configuration](#advanced-configuration)
+- [Details](#details)
+ - [Tables Creation](#tables-creation)
+ - [Buffering](#buffering)
+ - [Caveats](#caveats)
+- [Developer Notes](#developer-notes)
+ - [Updating Amazon SDK](#updating-amazon-sdk)
+ - [Running integration tests](#running-integration-tests)
+
+<!-- /MarkdownTOC -->
+
+## Prerequisites
+
+You must first set up an Amazon account as described below.
+
+Users are recommended to familiarize themselves with AWS pricing before using this service.
+Please note that there might be charges from Amazon when using this service to query and store data in DynamoDB.
+See [Amazon DynamoDB pricing pages](https://aws.amazon.com/dynamodb/pricing/) for more details.
+Please also note possible [Free Tier](https://aws.amazon.com/free/) benefits.
+
+### Setting Up an Amazon Account
+
+* [Sign up](https://aws.amazon.com/) for Amazon AWS.
+* Select the AWS region in the [AWS console](https://console.aws.amazon.com/) using [these instructions](https://docs.aws.amazon.com/awsconsolehelpdocs/latest/gsg/getting-started.html#select-region). Note the region identifier in the URL (e.g. `https://eu-west-1.console.aws.amazon.com/console/home?region=eu-west-1` means that region id is `eu-west-1`).
+* **Create user for openHAB with IAM**
+ * Open Services -> IAM -> Users -> Create new Users. Enter `openhab` in _User names_, keep _Generate an access key for each user_ checked, and finally click _Create_.
+ * _Show User Security Credentials_ and record the access key and secret key that are displayed.
+* **Configure user policy to have access for dynamodb**
+ * Open Services -> IAM -> Policies
+ * Check _AmazonDynamoDBFullAccess_ and click _Policy actions_ -> _Attach_
+ * Check the `openhab` user created above and click _Attach policy_
+
+## Configuration
+
+This service can be configured in the file `services/dynamodb.cfg`.
+
+### Basic configuration
+
+| Property | Default | Required | Description |
+| --------- | ------- | :------: | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| accessKey | | Yes | access key as shown in [Setting up Amazon account](#setting-up-an-amazon-account). |
+| secretKey | | Yes | secret key as shown in [Setting up Amazon account](#setting-up-an-amazon-account). |
+| region | | Yes | AWS region ID as described in [Setting up Amazon account](#setting-up-an-amazon-account). The region needs to match the region that was used to create the user. |
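+
+For example, a minimal `services/dynamodb.cfg` using static credentials could look like this (the key values below are placeholders):
+
+```ini
+accessKey=AKIAIOSFODNN7EXAMPLE
+secretKey=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
+region=eu-west-1
+```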
+
+### Configuration Using Credentials File
+
+Alternatively, instead of specifying `accessKey` and `secretKey`, one can use an AWS credentials (profile) file.
+
+| Property | Default | Required | Description |
+| ------------------ | ------- | :------: | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| profilesConfigFile | | Yes | path to the credentials file. For example, `/etc/openhab2/aws_creds`. Please note that the user that runs openHAB must have appropriate read access to the credentials file. For more details on the Amazon credentials file format, see [Amazon documentation](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html). |
+| profile | | Yes | name of the profile to use |
+| region | | Yes | AWS region ID as described in [Setting up Amazon account](#setting-up-an-amazon-account). The region needs to match the region that was used to create the user. |
+
+Example of service configuration file (`services/dynamodb.cfg`):
+
+```ini
+profilesConfigFile=/etc/openhab2/aws_creds
+profile=fooprofile
+region=eu-west-1
+```
+
+Example of credentials file (`/etc/openhab2/aws_creds`):
+
+```ini
+[fooprofile]
+aws_access_key_id=testAccessKey
+aws_secret_access_key=testSecretKey
+```
+
+### Advanced Configuration
+
+In addition to the configuration properties above, the following are also available:
+
+| Property | Default | Required | Description |
+| -------------------------- | ---------- | :------: | -------------------------------------------------------------------------------------------------- |
+| readCapacityUnits | 1 | No | Read capacity for the created tables |
+| writeCapacityUnits | 1 | No | Write capacity for the created tables |
+| tablePrefix | `openhab-` | No | Table prefix used in the name of created tables |
+| bufferCommitIntervalMillis | 1000 | No | Interval (in milliseconds) at which buffered data is committed (written) to DynamoDB |
+| bufferSize | 1000 | No | Internal buffer size in datapoints, used to batch writes to DynamoDB every `bufferCommitIntervalMillis` |
+
+Typically you should not need to modify parameters related to buffering.
+
+Refer to Amazon documentation on [provisioned throughput](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.ProvisionedThroughput.html) for details on read/write capacity.
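+
+As an illustration, the following additions to `services/dynamodb.cfg` would double the provisioned capacity of newly created tables and use a custom table prefix (the values are examples only):
+
+```ini
+readCapacityUnits=2
+writeCapacityUnits=2
+tablePrefix=myhome-
+```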
+
+All item- and event-related configuration is done in the file `persistence/dynamodb.persist`.
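+
+A minimal `persistence/dynamodb.persist` sketch (the strategy and the catch-all item selection are illustrative):
+
+```
+Strategies {
+    default = everyChange
+}
+
+Items {
+    // Persist every item on each change and restore them from DynamoDB on startup
+    * : strategy = everyChange, restoreOnStartup
+}
+```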
+
+## Details
+
+### Tables Creation
+
+When an item is persisted via this service, a table is created (if necessary).
+Currently, the service will create at most two tables for different item types.
+The tables will be named `<tablePrefix><item-type>`, where the `<item-type>` is either `bigdecimal` (numeric items) or `string` (string and complex items).
+
+Each table will have three columns: `itemname` (item name), `timeutc` (in ISO 8601 format with millisecond accuracy), and `itemstate` (either a number or string representing item state).
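+
+For example, with the default `tablePrefix` a Number item update would end up in the `openhab-bigdecimal` table as a row similar to the following (values are illustrative):
+
+| itemname | timeutc | itemstate |
+| ----------- | ------------------------ | --------- |
+| Temperature | 2020-01-01T12:00:00.000Z | 21.5 |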
+
+### Buffering
+
+By default, the service is asynchronous, which means that data is not written immediately to DynamoDB but instead buffered in-memory.
+The size of the buffer, in terms of datapoints, can be configured with `bufferSize`.
+Every `bufferCommitIntervalMillis` the whole buffer of data is flushed to DynamoDB.
+
+It is recommended to keep buffering enabled, since the synchronous behaviour (writing data immediately) might have an adverse impact on the whole system when many items are persisted at the same time.
+The buffering can be disabled by setting `bufferSize` to zero.
+
+The defaults should be suitable in many use cases.
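+
+For example, to disable buffering entirely and write each state update synchronously, set the following in `services/dynamodb.cfg`:
+
+```ini
+bufferSize=0
+```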
+
+### Caveats
+
+When the tables are created, their read/write capacity is set according to the configuration.
+However, the service does not modify the capacity of existing tables.
+As a workaround, you can modify the read/write capacity of existing tables using the [Amazon console](https://aws.amazon.com/console/).
+
+## Developer Notes
+
+### Updating Amazon SDK
+
+1. Clean `lib/*`
+2. Update SDK version in `scripts/fetch_sdk_pom.xml`. You can use the [maven online repository browser](https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-dynamodb) to find the latest version available online.
+3. `scripts/fetch_sdk.sh`
+4. Copy `scripts/target/site/dependencies.html` and `scripts/target/dependency/*.jar` to `lib/`
+5. Generate `build.properties` entries
+ `ls lib/*.jar | python -c "import sys; print(' ' + ',\\\\\\n '.join(map(str.strip, sys.stdin.readlines())))"`
+6. Generate `META-INF/MANIFEST.MF` `Bundle-ClassPath` entries
+ `ls lib/*.jar | python -c "import sys; print(' ' + ',\\n '.join(map(str.strip, sys.stdin.readlines())))"`
+7. Generate `.classpath` entries
+ `ls lib/*.jar | python -c "import sys;pre='<classpathentry exported=\"true\" kind=\"lib\" path=\"';post='\"/>'; print('\\t' + pre + (post + '\\n\\t' + pre).join(map(str.strip, sys.stdin.readlines())) + post)"`
+
+After these changes, it's good practice to run integration tests (against live AWS DynamoDB) in `org.openhab.persistence.dynamodb.test` bundle.
+See README.md in the test bundle for more information on how to execute the tests.
+
+### Running integration tests
+
+To run integration tests, one needs to provide AWS credentials.
+
+Eclipse instructions:
+
+1. Run all tests (in package org.openhab.persistence.dynamodb.internal) as JUnit tests
+2. Open the run configuration and go to the Arguments tab
+3. In VM arguments, provide the credentials for AWS:
+
+```
+-DDYNAMODBTEST_REGION=REGION-ID
+-DDYNAMODBTEST_ACCESS=ACCESS-KEY
+-DDYNAMODBTEST_SECRET=SECRET
+```
+
+The tests will create tables with the prefix `dynamodb-integration-tests-`.
+Note that when the tests begin, all data is removed from those tables!
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.persistence.dynamodb</artifactId>
+
+ <name>openHAB Add-ons :: Bundles :: Persistence Service :: DynamoDB</name>
+
+ <properties>
+ <bnd.importpackage>!com.amazonaws.*,!org.joda.convert.*,!com.sun.org.apache.xpath.*,!kotlin,!org.apache.log.*,!org.bouncycastle.*,!org.apache.avalon.*</bnd.importpackage>
+ </properties>
+
+ <dependencies>
+ <!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-core -->
+ <dependency>
+ <groupId>com.amazonaws</groupId>
+ <artifactId>aws-java-sdk-core</artifactId>
+ <version>1.11.213</version>
+ </dependency>
+ <dependency>
+ <groupId>com.amazonaws</groupId>
+ <artifactId>aws-java-sdk-dynamodb</artifactId>
+ <version>1.11.213</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-kms -->
+ <dependency>
+ <groupId>com.amazonaws</groupId>
+ <artifactId>aws-java-sdk-kms</artifactId>
+ <version>1.11.213</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-s3 -->
+ <dependency>
+ <groupId>com.amazonaws</groupId>
+ <artifactId>aws-java-sdk-s3</artifactId>
+ <version>1.11.213</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/com.amazonaws/jmespath-java -->
+ <dependency>
+ <groupId>com.amazonaws</groupId>
+ <artifactId>jmespath-java</artifactId>
+ <version>1.11.213</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/org.apache.httpcomponents/httpclient -->
+ <dependency>
+ <groupId>org.apache.httpcomponents</groupId>
+ <artifactId>httpclient</artifactId>
+ <version>4.5.2</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/software.amazon.ion/ion-java -->
+ <dependency>
+ <groupId>software.amazon.ion</groupId>
+ <artifactId>ion-java</artifactId>
+ <version>1.0.2</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/org.apache.httpcomponents/httpcore -->
+ <dependency>
+ <groupId>org.apache.httpcomponents</groupId>
+ <artifactId>httpcore</artifactId>
+ <version>4.4.4</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/commons-logging/commons-logging -->
+ <dependency>
+ <groupId>commons-logging</groupId>
+ <artifactId>commons-logging</artifactId>
+ <version>1.1.3</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/commons-codec/commons-codec -->
+ <dependency>
+ <groupId>commons-codec</groupId>
+ <artifactId>commons-codec</artifactId>
+ <version>1.9</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/joda-time/joda-time -->
+ <dependency>
+ <groupId>joda-time</groupId>
+ <artifactId>joda-time</artifactId>
+ <version>2.8.1</version>
+ </dependency>
+
+ <!-- The following dependencies are required for test resolution -->
+
+ <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-annotations -->
+ <dependency>
+ <groupId>com.fasterxml.jackson.core</groupId>
+ <artifactId>jackson-annotations</artifactId>
+ <version>2.6.0</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-core -->
+ <dependency>
+ <groupId>com.fasterxml.jackson.core</groupId>
+ <artifactId>jackson-core</artifactId>
+ <version>2.6.7</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
+ <dependency>
+ <groupId>com.fasterxml.jackson.core</groupId>
+ <artifactId>jackson-databind</artifactId>
+ <version>2.6.7.1</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.dataformat/jackson-dataformat-cbor -->
+ <dependency>
+ <groupId>com.fasterxml.jackson.dataformat</groupId>
+ <artifactId>jackson-dataformat-cbor</artifactId>
+ <version>2.6.7</version>
+ </dependency>
+ </dependencies>
+</project>
--- /dev/null
+#!/usr/bin/env bash
+DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+mvn -f $DIR/fetch_sdk_pom.xml clean process-sources project-info-reports:dependencies
+
+echo "Check $DIR/target/site/dependencies.html and $DIR/target/dependency"
\ No newline at end of file
--- /dev/null
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+ <groupId>groupId</groupId>
+ <artifactId>artifactId</artifactId>
+ <version>1.0</version>
+
+ <dependencies>
+ <dependency>
+ <groupId>com.amazonaws</groupId>
+ <artifactId>aws-java-sdk-dynamodb</artifactId>
+ <version>1.11.213</version>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <plugins>
+ <plugin>
+ <artifactId>maven-dependency-plugin</artifactId>
+ <executions>
+ <execution>
+ <phase>process-sources</phase>
+
+ <goals>
+ <goal>copy-dependencies</goal>
+ </goals>
+
+ <configuration>
+ <outputDirectory>${targetdirectory}</outputDirectory>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </build>
+</project>
\ No newline at end of file
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.persistence.dynamodb-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+ <repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+ <feature name="openhab-persistence-dynamodb" description="DynamoDB Persistence" version="${project.version}">
+ <feature>openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.dynamodb/${project.version}</bundle>
+ <configfile finalname="${openhab.conf}/services/dynamodb.cfg" override="false">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/dynamodb</configfile>
+ </feature>
+
+</features>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.Instant;
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.util.UUID;
+import java.util.concurrent.ArrayBlockingQueue;
+import java.util.concurrent.BlockingQueue;
+import java.util.concurrent.TimeUnit;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Abstract class for buffered persistence services
+ *
+ * @param <T> Type of the state as accepted by the AWS SDK.
+ *
+ * @author Sami Salonen - Initial contribution
+ * @author Kai Kreuzer - Migration to 3.x
+ *
+ */
+@NonNullByDefault
+public abstract class AbstractBufferedPersistenceService<T> implements PersistenceService {
+
+ private static final long BUFFER_OFFER_TIMEOUT_MILLIS = 500;
+
+ private final Logger logger = LoggerFactory.getLogger(AbstractBufferedPersistenceService.class);
+ protected @Nullable BlockingQueue<T> buffer;
+
+ private boolean writeImmediately;
+
+ protected void resetWithBufferSize(int bufferSize) {
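+ // A bufferSize of zero means "write immediately"; even then the data is routed
+ // through the queue before being flushed, hence the minimum capacity of 1.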
+ int capacity = Math.max(1, bufferSize);
+ buffer = new ArrayBlockingQueue<>(capacity, true);
+ writeImmediately = bufferSize == 0;
+ }
+
+ protected abstract T persistenceItemFromState(String name, State state, ZonedDateTime time);
+
+ protected abstract boolean isReadyToStore();
+
+ protected abstract void flushBufferedData();
+
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ long storeStart = System.currentTimeMillis();
+ String uuid = UUID.randomUUID().toString();
+ if (item.getState() instanceof UnDefType) {
+ logger.debug("Undefined item state received. Not storing item {}.", item.getName());
+ return;
+ }
+ if (!isReadyToStore()) {
+ return;
+ }
+ if (buffer == null) {
+ throw new IllegalStateException("Buffer not initialized with resetWithBufferSize. Bug?");
+ }
+ ZonedDateTime time = ZonedDateTime.ofInstant(Instant.ofEpochMilli(storeStart), ZoneId.systemDefault());
+ String realName = item.getName();
+ String name = (alias != null) ? alias : realName;
+ State state = item.getState();
+ T persistenceItem = persistenceItemFromState(name, state, time);
+ logger.trace("store() called with item {}, which was converted to {} [{}]", item, persistenceItem, uuid);
+ if (writeImmediately) {
+ logger.debug("Writing immediately item {} [{}]", realName, uuid);
+ // We want to write everything immediately
+ // Synchronous behavior to ensure buffer does not get full.
+ synchronized (this) {
+ boolean buffered = addToBuffer(persistenceItem);
+ assert buffered;
+ flushBufferedData();
+ }
+ } else {
+ long bufferStart = System.currentTimeMillis();
+ boolean buffered = addToBuffer(persistenceItem);
+ if (buffered) {
+ logger.debug("Buffered item {} in {} ms. Total time for store(): {} [{}]", realName,
+ System.currentTimeMillis() - bufferStart, System.currentTimeMillis() - storeStart, uuid);
+ } else {
+ logger.debug(
+ "Buffer is full. Writing buffered data immediately and trying again. Consider increasing bufferSize");
+ // Buffer is full, commit it immediately
+ flushBufferedData();
+ boolean buffered2 = addToBuffer(persistenceItem);
+ if (buffered2) {
+ logger.debug("Buffered item in {} ms (2nd try, flushed buffer in-between) [{}]",
+ System.currentTimeMillis() - bufferStart, uuid);
+ } else {
+ // The unlikely case happened -- buffer got full again immediately
+ logger.warn("Buffering failed for the second time -- Too small bufferSize? Discarding data [{}]",
+ uuid);
+ }
+ }
+ }
+ }
+
+ protected boolean addToBuffer(T persistenceItem) {
+ try {
+ return buffer != null && buffer.offer(persistenceItem, BUFFER_OFFER_TIMEOUT_MILLIS, TimeUnit.MILLISECONDS);
+ } catch (InterruptedException e) {
+ logger.warn("Interrupted when trying to buffer data! Dropping data");
+ return false;
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.math.BigDecimal;
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.time.format.DateTimeFormatter;
+import java.time.format.DateTimeParseException;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.openhab.core.items.Item;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.PlayerItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.PlayPauseType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.library.types.RewindFastforwardType;
+import org.openhab.core.library.types.StringListType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.library.types.UpDownType;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Base class for all DynamoDBItems. Represents an openHAB Item serialized in a format suitable for the database
+ *
+ * @param <T> Type of the state as accepted by the AWS SDK.
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+public abstract class AbstractDynamoDBItem<T> implements DynamoDBItem<T> {
+
+ public static final DateTimeFormatter DATEFORMATTER = DateTimeFormatter.ofPattern(DATE_FORMAT)
+ .withZone(ZoneId.of("UTC"));
+
+ private static final String UNDEFINED_PLACEHOLDER = "<org.openhab.core.types.UnDefType.UNDEF>";
+
+ private static final Map<Class<? extends Item>, Class<? extends DynamoDBItem<?>>> ITEM_CLASS_MAP = new HashMap<>();
+
+ static {
+ ITEM_CLASS_MAP.put(CallItem.class, DynamoDBStringItem.class);
+ ITEM_CLASS_MAP.put(ContactItem.class, DynamoDBBigDecimalItem.class);
+ ITEM_CLASS_MAP.put(DateTimeItem.class, DynamoDBStringItem.class);
+ ITEM_CLASS_MAP.put(LocationItem.class, DynamoDBStringItem.class);
+ ITEM_CLASS_MAP.put(NumberItem.class, DynamoDBBigDecimalItem.class);
+ ITEM_CLASS_MAP.put(RollershutterItem.class, DynamoDBBigDecimalItem.class);
+ ITEM_CLASS_MAP.put(StringItem.class, DynamoDBStringItem.class);
+ ITEM_CLASS_MAP.put(SwitchItem.class, DynamoDBBigDecimalItem.class);
+ ITEM_CLASS_MAP.put(DimmerItem.class, DynamoDBBigDecimalItem.class); // inherited from SwitchItem (!)
+ ITEM_CLASS_MAP.put(ColorItem.class, DynamoDBStringItem.class); // inherited from DimmerItem
+ ITEM_CLASS_MAP.put(PlayerItem.class, DynamoDBStringItem.class);
+ }
+
+ public static final Class<DynamoDBItem<?>> getDynamoItemClass(Class<? extends Item> itemClass)
+ throws IllegalArgumentException {
+ @SuppressWarnings("unchecked")
+ Class<DynamoDBItem<?>> dtoclass = (Class<DynamoDBItem<?>>) ITEM_CLASS_MAP.get(itemClass);
+ if (dtoclass == null) {
+ throw new IllegalArgumentException(String.format("Unknown item class %s", itemClass));
+ }
+ return dtoclass;
+ }
+
+ private final Logger logger = LoggerFactory.getLogger(AbstractDynamoDBItem.class);
+
+ protected String name;
+ protected T state;
+ protected ZonedDateTime time;
+
+ public AbstractDynamoDBItem(String name, T state, ZonedDateTime time) {
+ this.name = name;
+ this.state = state;
+ this.time = time;
+ }
+
+ public static DynamoDBItem<?> fromState(String name, State state, ZonedDateTime time) {
+ if (state instanceof DecimalType && !(state instanceof HSBType)) {
+ // also covers PercentType which is inherited from DecimalType
+ return new DynamoDBBigDecimalItem(name, ((DecimalType) state).toBigDecimal(), time);
+ } else if (state instanceof OnOffType) {
+ return new DynamoDBBigDecimalItem(name,
+ ((OnOffType) state) == OnOffType.ON ? BigDecimal.ONE : BigDecimal.ZERO, time);
+ } else if (state instanceof OpenClosedType) {
+ return new DynamoDBBigDecimalItem(name,
+ ((OpenClosedType) state) == OpenClosedType.OPEN ? BigDecimal.ONE : BigDecimal.ZERO, time);
+ } else if (state instanceof UpDownType) {
+ return new DynamoDBBigDecimalItem(name,
+ ((UpDownType) state) == UpDownType.UP ? BigDecimal.ONE : BigDecimal.ZERO, time);
+ } else if (state instanceof DateTimeType) {
+ return new DynamoDBStringItem(name, ((DateTimeType) state).getZonedDateTime().format(DATEFORMATTER), time);
+ } else if (state instanceof UnDefType) {
+ return new DynamoDBStringItem(name, UNDEFINED_PLACEHOLDER, time);
+ } else if (state instanceof StringListType) {
+ return new DynamoDBStringItem(name, state.toFullString(), time);
+ } else {
+ // HSBType, PointType, PlayPauseType and StringType
+ return new DynamoDBStringItem(name, state.toFullString(), time);
+ }
+ }
+
+ @Override
+ public HistoricItem asHistoricItem(final Item item) {
+ final State[] state = new State[1];
+ accept(new DynamoDBItemVisitor() {
+
+ @Override
+ public void visit(DynamoDBStringItem dynamoStringItem) {
+ if (item instanceof ColorItem) {
+ state[0] = new HSBType(dynamoStringItem.getState());
+ } else if (item instanceof LocationItem) {
+ state[0] = new PointType(dynamoStringItem.getState());
+ } else if (item instanceof PlayerItem) {
+ String value = dynamoStringItem.getState();
+ try {
+ state[0] = PlayPauseType.valueOf(value);
+ } catch (IllegalArgumentException e) {
+ state[0] = RewindFastforwardType.valueOf(value);
+ }
+ } else if (item instanceof DateTimeItem) {
+ try {
+ // Parse ZonedDateTime from string. DATEFORMATTER assumes UTC in case it is not clear
+ // from the string (should be).
+ // We convert to default/local timezone for user convenience (e.g. display)
+ state[0] = new DateTimeType(ZonedDateTime.parse(dynamoStringItem.getState(), DATEFORMATTER)
+ .withZoneSameInstant(ZoneId.systemDefault()));
+ } catch (DateTimeParseException e) {
+ logger.warn("Failed to parse {} as date. Outputting UNDEF instead",
+ dynamoStringItem.getState());
+ state[0] = UnDefType.UNDEF;
+ }
+ } else if (dynamoStringItem.getState().equals(UNDEFINED_PLACEHOLDER)) {
+ state[0] = UnDefType.UNDEF;
+ } else if (item instanceof CallItem) {
+ String parts = dynamoStringItem.getState();
+ String[] strings = parts.split(",");
+ String orig = strings[0];
+ String dest = strings[1];
+ state[0] = new StringListType(orig, dest);
+ } else {
+ state[0] = new StringType(dynamoStringItem.getState());
+ }
+ }
+
+ @Override
+ public void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
+ if (item instanceof NumberItem) {
+ state[0] = new DecimalType(dynamoBigDecimalItem.getState());
+ } else if (item instanceof DimmerItem) {
+ state[0] = new PercentType(dynamoBigDecimalItem.getState());
+ } else if (item instanceof SwitchItem) {
+ state[0] = dynamoBigDecimalItem.getState().compareTo(BigDecimal.ONE) == 0 ? OnOffType.ON
+ : OnOffType.OFF;
+ } else if (item instanceof ContactItem) {
+ state[0] = dynamoBigDecimalItem.getState().compareTo(BigDecimal.ONE) == 0 ? OpenClosedType.OPEN
+ : OpenClosedType.CLOSED;
+ } else if (item instanceof RollershutterItem) {
+ state[0] = new PercentType(dynamoBigDecimalItem.getState());
+ } else {
+ logger.warn("Not sure how to convert big decimal item {} to type {}. Using StringType as fallback",
+ dynamoBigDecimalItem.getName(), item.getClass());
+ state[0] = new StringType(dynamoBigDecimalItem.getState().toString());
+ }
+ }
+ });
+ return new DynamoDBHistoricItem(getName(), state[0], getTime());
+ }
+
+ /**
+ * We declare all getters and setters here and have the child classes implement them. Having the getter
+ * and setter implementations here in the parent class does not work with the introspection done by the AWS SDK (1.11.56).
+ */
+
+ /*
+ * (non-Javadoc)
+ *
+ * @see org.openhab.persistence.dynamodb.internal.DynamoItem#accept(org.openhab.persistence.dynamodb.internal.
+ * DynamoItemVisitor)
+ */
+ @Override
+ public abstract void accept(DynamoDBItemVisitor visitor);
+
+ @Override
+ public String toString() {
+ return DATEFORMATTER.format(time) + ": " + name + " -> " + state.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.math.BigDecimal;
+import java.math.MathContext;
+import java.time.ZonedDateTime;
+
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBDocument;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBRangeKey;
+
+/**
+ * DynamoDBItem for items that can be serialized as DynamoDB number
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@DynamoDBDocument
+public class DynamoDBBigDecimalItem extends AbstractDynamoDBItem<BigDecimal> {
+
+ /**
+ * We get the following error if the BigDecimal has too many digits
+ * "Attempting to store more than 38 significant digits in a Number"
+ *
+ * See "Data types" section in
+ * http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Limits.html
+ */
+ private static final int MAX_DIGITS_SUPPORTED_BY_AMAZON = 38;
+
+ public DynamoDBBigDecimalItem() {
+ this(null, null, null);
+ }
+
+ public DynamoDBBigDecimalItem(String name, BigDecimal state, ZonedDateTime time) {
+ super(name, state, time);
+ }
+
+ @DynamoDBAttribute(attributeName = DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE)
+ @Override
+ public BigDecimal getState() {
+ // When serializing this to the wire, we round the number in order to ensure
+ // that it is within the dynamodb limits
+ return loseDigits(state);
+ }
+
+ @DynamoDBHashKey(attributeName = DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME)
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ @DynamoDBRangeKey(attributeName = ATTRIBUTE_NAME_TIMEUTC)
+ public ZonedDateTime getTime() {
+ return time;
+ }
+
+ @Override
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ @Override
+ public void setState(BigDecimal state) {
+ this.state = state;
+ }
+
+ @Override
+ public void setTime(ZonedDateTime time) {
+ this.time = time;
+ }
+
+ @Override
+ public void accept(DynamoDBItemVisitor visitor) {
+ visitor.visit(this);
+ }
+
+ static BigDecimal loseDigits(BigDecimal number) {
+ if (number == null) {
+ return null;
+ }
+ return number.round(new MathContext(MAX_DIGITS_SUPPORTED_BY_AMAZON));
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.AWSStaticCredentialsProvider;
+import com.amazonaws.regions.Regions;
+import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
+import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
+import com.amazonaws.services.dynamodbv2.document.DynamoDB;
+
+/**
+ * Shallow wrapper around the AWS DynamoDB clients
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+public class DynamoDBClient {
+ private final Logger logger = LoggerFactory.getLogger(DynamoDBClient.class);
+ private DynamoDB dynamo;
+ private AmazonDynamoDB client;
+
+ public DynamoDBClient(AWSCredentials credentials, Regions region) {
+ client = AmazonDynamoDBClientBuilder.standard().withRegion(region)
+ .withCredentials(new AWSStaticCredentialsProvider(credentials)).build();
+ dynamo = new DynamoDB(client);
+ }
+
+ public DynamoDBClient(DynamoDBConfig clientConfig) {
+ this(clientConfig.getCredentials(), clientConfig.getRegion());
+ }
+
+ public AmazonDynamoDB getDynamoClient() {
+ return client;
+ }
+
+ public DynamoDB getDynamoDB() {
+ return dynamo;
+ }
+
+ public void shutdown() {
+ dynamo.shutdown();
+ }
+
+ public boolean checkConnection() {
+ try {
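+ // Listing at most one table is a cheap way to verify connectivity and credentials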
+ dynamo.listTables(1).firstPage();
+ } catch (Exception e) {
+ logger.warn("Got internal server error when trying to list tables: {}", e.getMessage());
+ return false;
+ }
+ return true;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Arrays;
+import java.util.Map;
+import java.util.stream.Collectors;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.amazonaws.auth.AWSCredentials;
+import com.amazonaws.auth.BasicAWSCredentials;
+import com.amazonaws.auth.profile.ProfilesConfigFile;
+import com.amazonaws.regions.Regions;
+
+/**
+ * Configuration for DynamoDB connections
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@NonNullByDefault
+public class DynamoDBConfig {
+ public static final String DEFAULT_TABLE_PREFIX = "openhab-";
+ public static final boolean DEFAULT_CREATE_TABLE_ON_DEMAND = true;
+ public static final long DEFAULT_READ_CAPACITY_UNITS = 1;
+ public static final long DEFAULT_WRITE_CAPACITY_UNITS = 1;
+ public static final long DEFAULT_BUFFER_COMMIT_INTERVAL_MILLIS = 1000;
+ public static final int DEFAULT_BUFFER_SIZE = 1000;
+
+ private static final Logger LOGGER = LoggerFactory.getLogger(DynamoDBConfig.class);
+
+ private String tablePrefix = DEFAULT_TABLE_PREFIX;
+ private Regions region;
+ private AWSCredentials credentials;
+ private boolean createTable = DEFAULT_CREATE_TABLE_ON_DEMAND;
+ private long readCapacityUnits = DEFAULT_READ_CAPACITY_UNITS;
+ private long writeCapacityUnits = DEFAULT_WRITE_CAPACITY_UNITS;
+ private long bufferCommitIntervalMillis = DEFAULT_BUFFER_COMMIT_INTERVAL_MILLIS;
+ private int bufferSize = DEFAULT_BUFFER_SIZE;
+
+ /**
+ *
+ * @param config persistence service configuration
+ * @return DynamoDB configuration. Returns null in case of configuration errors
+ */
+ public static @Nullable DynamoDBConfig fromConfig(Map<String, Object> config) {
+ try {
+ String regionName = (String) config.get("region");
+ if (regionName == null) {
+ return null;
+ }
+ final Regions region;
+ try {
+ region = Regions.fromName(regionName);
+ } catch (IllegalArgumentException e) {
+ LOGGER.error("Specify valid AWS region to use, got {}. Valid values include: {}", regionName, Arrays
+ .asList(Regions.values()).stream().map(r -> r.getName()).collect(Collectors.joining(",")));
+ return null;
+ }
+
+ AWSCredentials credentials;
+ String accessKey = (String) config.get("accessKey");
+ String secretKey = (String) config.get("secretKey");
+ if (accessKey != null && !accessKey.isBlank() && secretKey != null && !secretKey.isBlank()) {
+ LOGGER.debug("accessKey and secretKey specified. Using those.");
+ credentials = new BasicAWSCredentials(accessKey, secretKey);
+ } else {
+ LOGGER.debug("accessKey and/or secretKey blank. Checking profilesConfigFile and profile.");
+ String profilesConfigFile = (String) config.get("profilesConfigFile");
+ String profile = (String) config.get("profile");
+ if (profilesConfigFile == null || profilesConfigFile.isBlank() || profile == null
+ || profile.isBlank()) {
+ LOGGER.error("Specify either 1) accessKey and secretKey; or 2) profilesConfigFile and "
+ + "profile for providing AWS credentials");
+ return null;
+ }
+ credentials = new ProfilesConfigFile(profilesConfigFile).getCredentials(profile);
+ }
+
+ String table = (String) config.get("tablePrefix");
+ if (table == null || table.isBlank()) {
+ LOGGER.debug("Using default table name {}", DEFAULT_TABLE_PREFIX);
+ table = DEFAULT_TABLE_PREFIX;
+ }
+
+ final boolean createTable;
+ String createTableParam = (String) config.get("createTable");
+ if (createTableParam == null || createTableParam.isBlank()) {
+ LOGGER.debug("Creating table on demand: {}", DEFAULT_CREATE_TABLE_ON_DEMAND);
+ createTable = DEFAULT_CREATE_TABLE_ON_DEMAND;
+ } else {
+ createTable = Boolean.parseBoolean(createTableParam);
+ }
+
+ final long readCapacityUnits;
+ String readCapacityUnitsParam = (String) config.get("readCapacityUnits");
+ if (readCapacityUnitsParam == null || readCapacityUnitsParam.isBlank()) {
+ LOGGER.debug("Read capacity units: {}", DEFAULT_READ_CAPACITY_UNITS);
+ readCapacityUnits = DEFAULT_READ_CAPACITY_UNITS;
+ } else {
+ readCapacityUnits = Long.parseLong(readCapacityUnitsParam);
+ }
+
+ final long writeCapacityUnits;
+ String writeCapacityUnitsParam = (String) config.get("writeCapacityUnits");
+ if (writeCapacityUnitsParam == null || writeCapacityUnitsParam.isBlank()) {
+ LOGGER.debug("Write capacity units: {}", DEFAULT_WRITE_CAPACITY_UNITS);
+ writeCapacityUnits = DEFAULT_WRITE_CAPACITY_UNITS;
+ } else {
+ writeCapacityUnits = Long.parseLong(writeCapacityUnitsParam);
+ }
+
+ final long bufferCommitIntervalMillis;
+ String bufferCommitIntervalMillisParam = (String) config.get("bufferCommitIntervalMillis");
+ if (bufferCommitIntervalMillisParam == null || bufferCommitIntervalMillisParam.isBlank()) {
+ LOGGER.debug("Buffer commit interval millis: {}", DEFAULT_BUFFER_COMMIT_INTERVAL_MILLIS);
+ bufferCommitIntervalMillis = DEFAULT_BUFFER_COMMIT_INTERVAL_MILLIS;
+ } else {
+ bufferCommitIntervalMillis = Long.parseLong(bufferCommitIntervalMillisParam);
+ }
+
+ final int bufferSize;
+ String bufferSizeParam = (String) config.get("bufferSize");
+ if (bufferSizeParam == null || bufferSizeParam.isBlank()) {
+ LOGGER.debug("Buffer size: {}", DEFAULT_BUFFER_SIZE);
+ bufferSize = DEFAULT_BUFFER_SIZE;
+ } else {
+ bufferSize = Integer.parseInt(bufferSizeParam);
+ }
+
+ return new DynamoDBConfig(region, credentials, table, createTable, readCapacityUnits, writeCapacityUnits,
+ bufferCommitIntervalMillis, bufferSize);
+ } catch (Exception e) {
+ LOGGER.error("Error with configuration", e);
+ return null;
+ }
+ }
+
+ public DynamoDBConfig(Regions region, AWSCredentials credentials, String table, boolean createTable,
+ long readCapacityUnits, long writeCapacityUnits, long bufferCommitIntervalMillis, int bufferSize) {
+ this.region = region;
+ this.credentials = credentials;
+ this.tablePrefix = table;
+ this.createTable = createTable;
+ this.readCapacityUnits = readCapacityUnits;
+ this.writeCapacityUnits = writeCapacityUnits;
+ this.bufferCommitIntervalMillis = bufferCommitIntervalMillis;
+ this.bufferSize = bufferSize;
+ }
+
+ public AWSCredentials getCredentials() {
+ return credentials;
+ }
+
+ public String getTablePrefix() {
+ return tablePrefix;
+ }
+
+ public Regions getRegion() {
+ return region;
+ }
+
+ public boolean isCreateTable() {
+ return createTable;
+ }
+
+ public long getReadCapacityUnits() {
+ return readCapacityUnits;
+ }
+
+ public long getWriteCapacityUnits() {
+ return writeCapacityUnits;
+ }
+
+ public long getBufferCommitIntervalMillis() {
+ return bufferCommitIntervalMillis;
+ }
+
+ public int getBufferSize() {
+ return bufferSize;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.format.DateTimeFormatter;
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * This is a Java bean used to return historic items from DynamoDB.
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@NonNullByDefault
+public class DynamoDBHistoricItem implements HistoricItem {
+ private final String name;
+ private final State state;
+ private final ZonedDateTime timestamp;
+
+ public DynamoDBHistoricItem(String name, State state, ZonedDateTime timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public ZonedDateTime getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public String toString() {
+ return timestamp.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME) + ": " + name + " -> " + state.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.HistoricItem;
+
+/**
+ * Represents openHAB Item serialized in a suitable format for the database
+ *
+ * @param <T> Type of the state as accepted by the AWS SDK.
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+public interface DynamoDBItem<T> {
+
+ static final String DATE_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'";
+
+ static final String ATTRIBUTE_NAME_TIMEUTC = "timeutc";
+
+ static final String ATTRIBUTE_NAME_ITEMNAME = "itemname";
+
+ static final String ATTRIBUTE_NAME_ITEMSTATE = "itemstate";
+
+ /**
+ * Convert this DynamoDBItem to a HistoricItem.
+ *
+ * @param item Item representing this item. Used to determine item type.
+ * @return HistoricItem representing this DynamoDBItem.
+ */
+ HistoricItem asHistoricItem(Item item);
+
+ String getName();
+
+ T getState();
+
+ ZonedDateTime getTime();
+
+ void setName(String name);
+
+ void setState(T state);
+
+ void setTime(ZonedDateTime time);
+
+ void accept(DynamoDBItemVisitor visitor);
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * Visitor for DynamoDBItem
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public interface DynamoDBItemVisitor {
+
+ public void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem);
+
+ public void visit(DynamoDBStringItem dynamoStringItem);
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+import java.util.ArrayDeque;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.Deque;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Map.Entry;
+import java.util.Set;
+import java.util.concurrent.Executors;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+import java.util.function.Function;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.common.NamedThreadFactory;
+import org.openhab.core.config.core.ConfigurableService;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.State;
+import org.osgi.framework.BundleContext;
+import org.osgi.framework.Constants;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.amazonaws.AmazonClientException;
+import com.amazonaws.AmazonServiceException;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper.FailedBatch;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig.PaginationLoadingStrategy;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBQueryExpression;
+import com.amazonaws.services.dynamodbv2.datamodeling.PaginatedQueryList;
+import com.amazonaws.services.dynamodbv2.document.BatchWriteItemOutcome;
+import com.amazonaws.services.dynamodbv2.model.CreateTableRequest;
+import com.amazonaws.services.dynamodbv2.model.GlobalSecondaryIndex;
+import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput;
+import com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException;
+import com.amazonaws.services.dynamodbv2.model.TableDescription;
+import com.amazonaws.services.dynamodbv2.model.TableStatus;
+import com.amazonaws.services.dynamodbv2.model.WriteRequest;
+
+/**
+ * This is the implementation of the DynamoDB {@link PersistenceService}. It persists item values
+ * using the <a href="https://aws.amazon.com/dynamodb/">Amazon DynamoDB</a> database. The states (
+ * {@link State}) of an {@link Item} are persisted in DynamoDB tables.
+ *
+ * The service creates tables automatically, one for numbers, and one for strings.
+ *
+ * @see AbstractDynamoDBItem.fromState for details on how different items are persisted
+ *
+ * @author Sami Salonen - Initial contribution
+ * @author Kai Kreuzer - Migration to 3.x
+ *
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.dynamodb", //
+ property = Constants.SERVICE_PID + "=org.openhab.dynamodb")
+@ConfigurableService(category = "persistence", label = "DynamoDB Persistence Service", description_uri = DynamoDBPersistenceService.CONFIG_URI)
+public class DynamoDBPersistenceService extends AbstractBufferedPersistenceService<DynamoDBItem<?>>
+ implements QueryablePersistenceService {
+
+ protected static final String CONFIG_URI = "persistence:dynamodb";
+
+ private class ExponentialBackoffRetry implements Runnable {
+ private int retry;
+ private Map<String, List<WriteRequest>> unprocessedItems;
+ private @Nullable Exception lastException;
+
+ public ExponentialBackoffRetry(Map<String, List<WriteRequest>> unprocessedItems) {
+ this.unprocessedItems = unprocessedItems;
+ }
+
+ @Override
+ public void run() {
+ logger.debug("Error storing object to dynamo, unprocessed items: {}. Retrying with exponential back-off",
+ unprocessedItems);
+ lastException = null;
+ while (!unprocessedItems.isEmpty() && retry < WAIT_MILLIS_IN_RETRIES.length) {
+ if (!sleep()) {
+ // Interrupted
+ return;
+ }
+ retry++;
+ try {
+ BatchWriteItemOutcome outcome = DynamoDBPersistenceService.this.db.getDynamoDB()
+ .batchWriteItemUnprocessed(unprocessedItems);
+ unprocessedItems = outcome.getUnprocessedItems();
+ lastException = null;
+ } catch (AmazonServiceException e) {
+ if (e instanceof ResourceNotFoundException) {
+ logger.debug(
+ "DynamoDB query raised unexpected exception: {}. This might happen if table was recently created",
+ e.getMessage());
+ } else {
+ logger.debug("DynamoDB query raised unexpected exception: {}.", e.getMessage());
+ }
+ lastException = e;
+ continue;
+ }
+ }
+ if (unprocessedItems.isEmpty()) {
+ logger.debug("After {} retries successfully wrote all unprocessed items", retry);
+ } else {
+ logger.warn(
+ "Even after retries failed to write some items. Last exception: {} {}, unprocessed items: {}",
+ lastException == null ? "null" : lastException.getClass().getName(),
+ lastException == null ? "null" : lastException.getMessage(), unprocessedItems);
+ }
+ }
+
+ private boolean sleep() {
+ try {
+ long sleepTime;
+ if (retry == 1 && lastException != null && lastException instanceof ResourceNotFoundException) {
+ sleepTime = WAIT_ON_FIRST_RESOURCE_NOT_FOUND_MILLIS;
+ } else {
+ sleepTime = WAIT_MILLIS_IN_RETRIES[retry];
+ }
+ Thread.sleep(sleepTime);
+ return true;
+ } catch (InterruptedException e) {
+ logger.debug("Interrupted while writing data!");
+ return false;
+ }
+ }
+
+ public Map<String, List<WriteRequest>> getUnprocessedItems() {
+ return unprocessedItems;
+ }
+ }
+
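+ // Backoff schedule (milliseconds) for retrying unprocessed batch writes. A longer
+ // initial wait is used when the table was only just created (ResourceNotFoundException).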
+ private static final int WAIT_ON_FIRST_RESOURCE_NOT_FOUND_MILLIS = 5000;
+ private static final int[] WAIT_MILLIS_IN_RETRIES = new int[] { 100, 100, 200, 300, 500 };
+ private static final String DYNAMODB_THREADPOOL_NAME = "dynamodbPersistenceService";
+
+ private final ItemRegistry itemRegistry;
+ private @Nullable DynamoDBClient db;
+ private final Logger logger = LoggerFactory.getLogger(DynamoDBPersistenceService.class);
+ private boolean isProperlyConfigured;
+ private @NonNullByDefault({}) DynamoDBConfig dbConfig;
+ private @NonNullByDefault({}) DynamoDBTableNameResolver tableNameResolver;
+ private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1,
+ new NamedThreadFactory(DYNAMODB_THREADPOOL_NAME));
+ private @Nullable ScheduledFuture<?> writeBufferedDataFuture;
+
+ @Activate
+ public DynamoDBPersistenceService(final @Reference ItemRegistry itemRegistry) {
+ this.itemRegistry = itemRegistry;
+ }
+
+ /**
+ * For testing. Allows access to underlying DynamoDBClient.
+ *
+ * @return DynamoDBClient connected to AWS DynamoDB.
+ */
+ @Nullable
+ DynamoDBClient getDb() {
+ return db;
+ }
+
+ @Activate
+ public void activate(final @Nullable BundleContext bundleContext, final Map<String, Object> config) {
+ resetClient();
+ dbConfig = DynamoDBConfig.fromConfig(config);
+ if (dbConfig == null) {
+ // Configuration was invalid. Abort service activation.
+ // The error is already logged in fromConfig.
+ return;
+ }
+
+ tableNameResolver = new DynamoDBTableNameResolver(dbConfig.getTablePrefix());
+ try {
+ if (!ensureClient()) {
+ logger.error("Error creating dynamodb database client. Aborting service activation.");
+ return;
+ }
+ } catch (Exception e) {
+ logger.error("Error constructing dynamodb client", e);
+ return;
+ }
+
+ writeBufferedDataFuture = null;
+ resetWithBufferSize(dbConfig.getBufferSize());
+ long commitIntervalMillis = dbConfig.getBufferCommitIntervalMillis();
+ if (commitIntervalMillis > 0) {
+ writeBufferedDataFuture = scheduler.scheduleWithFixedDelay(new Runnable() {
+ @Override
+ public void run() {
+ try {
+ DynamoDBPersistenceService.this.flushBufferedData();
+ } catch (RuntimeException e) {
+ // We want to catch all unexpected exceptions since any unhandled exception makes
+ // ScheduledExecutorService halt the regular running of the task.
+ // It is better to log the exception and try again on the next cycle.
+ logger.warn(
+ "Execution of scheduled flushing of buffered data failed unexpectedly. Ignoring exception, trying again according to configured commit interval of {} ms.",
+ commitIntervalMillis, e);
+ }
+ }
+ }, 0, commitIntervalMillis, TimeUnit.MILLISECONDS);
+ }
+ isProperlyConfigured = true;
+ logger.debug("dynamodb persistence service activated");
+ }
+
+ @Deactivate
+ public void deactivate() {
+ logger.debug("dynamodb persistence service deactivated");
+ if (writeBufferedDataFuture != null) {
+ writeBufferedDataFuture.cancel(false);
+ writeBufferedDataFuture = null;
+ }
+ resetClient();
+ }
+
+ /**
+ * Initializes DynamoDBClient (db field)
+ *
+ * If the DynamoDBClient constructor throws an exception, the error is logged and false is returned.
+ *
+ * @return whether initialization was successful.
+ */
+ private boolean ensureClient() {
+ if (db == null) {
+ try {
+ db = new DynamoDBClient(dbConfig);
+ } catch (Exception e) {
+ logger.error("Error constructing dynamodb client", e);
+ return false;
+ }
+ }
+ return true;
+ }
+
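+ /**
+ * Convert the given name, state and timestamp to a DynamoDBItem DTO suitable for storage
+ */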
+ @Override
+ public DynamoDBItem<?> persistenceItemFromState(String name, State state, ZonedDateTime time) {
+ return AbstractDynamoDBItem.fromState(name, state, time);
+ }
+
+ /**
+ * Create table (if not present) and wait for table to become active.
+ *
+ * Synchronized in order to ensure that at most a single thread is creating the table at a time
+ *
+ * @param mapper mapper used to generate the table creation request
+ * @param dtoClass DTO class determining the table to create
+ * @return whether table creation succeeded.
+ */
+ private synchronized boolean createTable(DynamoDBMapper mapper, Class<?> dtoClass) {
+ if (db == null) {
+ return false;
+ }
+ String tableName;
+ try {
+ ProvisionedThroughput provisionedThroughput = new ProvisionedThroughput(dbConfig.getReadCapacityUnits(),
+ dbConfig.getWriteCapacityUnits());
+ CreateTableRequest request = mapper.generateCreateTableRequest(dtoClass);
+ request.setProvisionedThroughput(provisionedThroughput);
+ if (request.getGlobalSecondaryIndexes() != null) {
+ for (GlobalSecondaryIndex index : request.getGlobalSecondaryIndexes()) {
+ index.setProvisionedThroughput(provisionedThroughput);
+ }
+ }
+ tableName = request.getTableName();
+ try {
+ db.getDynamoClient().describeTable(tableName);
+ } catch (ResourceNotFoundException e) {
+ // No table present, continue with creation
+ db.getDynamoClient().createTable(request);
+ } catch (AmazonClientException e) {
+ logger.error("Table creation failed due to error in describeTable operation", e);
+ return false;
+ }
+
+ // table found or just created, wait
+ return waitForTableToBecomeActive(tableName);
+ } catch (AmazonClientException e) {
+ logger.error("Exception when creating table", e);
+ return false;
+ }
+ }
+
+ private boolean waitForTableToBecomeActive(String tableName) {
+ try {
+ logger.debug("Checking if table '{}' is created...", tableName);
+ final TableDescription tableDescription;
+ try {
+ tableDescription = db.getDynamoDB().getTable(tableName).waitForActive();
+ } catch (IllegalArgumentException e) {
+ logger.warn("Table '{}' is being deleted: {} {}", tableName, e.getClass().getSimpleName(),
+ e.getMessage());
+ return false;
+ } catch (ResourceNotFoundException e) {
+ logger.warn("Table '{}' was deleted unexpectedly: {} {}", tableName, e.getClass().getSimpleName(),
+ e.getMessage());
+ return false;
+ }
+ boolean success = TableStatus.ACTIVE.equals(TableStatus.fromValue(tableDescription.getTableStatus()));
+ if (success) {
+ logger.debug("Creation of table '{}' successful, table status is now {}", tableName,
+ tableDescription.getTableStatus());
+ } else {
+ logger.warn("Creation of table '{}' unsuccessful, table status is now {}", tableName,
+ tableDescription.getTableStatus());
+ }
+ return success;
+ } catch (AmazonClientException e) {
+ logger.error("Exception when checking table status (describe): {}", e.getMessage());
+ return false;
+ } catch (InterruptedException e) {
+ logger.error("Interrupted while trying to check table status: {}", e.getMessage());
+ return false;
+ }
+ }
+
+ private void resetClient() {
+ if (db == null) {
+ return;
+ }
+ db.shutdown();
+ db = null;
+ dbConfig = null;
+ tableNameResolver = null;
+ isProperlyConfigured = false;
+ }
+
+ private DynamoDBMapper getDBMapper(String tableName) {
+ try {
+ DynamoDBMapperConfig mapperConfig = new DynamoDBMapperConfig.Builder()
+ .withTableNameOverride(new DynamoDBMapperConfig.TableNameOverride(tableName))
+ .withPaginationLoadingStrategy(PaginationLoadingStrategy.LAZY_LOADING).build();
+ return new DynamoDBMapper(db.getDynamoClient(), mapperConfig);
+ } catch (AmazonClientException e) {
+ logger.error("Error getting db mapper: {}", e.getMessage());
+ throw e;
+ }
+ }
+
+ @Override
+ protected boolean isReadyToStore() {
+ return isProperlyConfigured && ensureClient();
+ }
+
+ @Override
+ public String getId() {
+ return "dynamodb";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "DynamoDB";
+ }
+
+ @Override
+ public Set<PersistenceItemInfo> getItemInfo() {
+ return Collections.emptySet();
+ }
+
+ @Override
+ protected void flushBufferedData() {
+ if (buffer == null || buffer.isEmpty()) {
+ return;
+ }
+ logger.debug("Writing buffered data. Buffer size: {}", buffer.size());
+
+ for (;;) {
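+ // Keep flushing until the buffer stays empty: new items may be added to the buffer while
+ // earlier batches are being written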
+ Map<String, Deque<DynamoDBItem<?>>> itemsByTable = readBuffer();
+ // Write batch of data, one table at a time
+ for (Entry<String, Deque<DynamoDBItem<?>>> entry : itemsByTable.entrySet()) {
+ String tableName = entry.getKey();
+ Deque<DynamoDBItem<?>> batch = entry.getValue();
+ if (!batch.isEmpty()) {
+ flushBatch(getDBMapper(tableName), batch);
+ }
+ }
+ if (buffer == null || buffer.isEmpty()) {
+ break;
+ }
+ }
+ }
+
+ private Map<String, Deque<DynamoDBItem<?>>> readBuffer() {
+ Map<String, Deque<DynamoDBItem<?>>> batchesByTable = new HashMap<>(2);
+ // Get batch of data
+ while (!buffer.isEmpty()) {
+ DynamoDBItem<?> dynamoItem = buffer.poll();
+ if (dynamoItem == null) {
+ break;
+ }
+ String tableName = tableNameResolver.fromItem(dynamoItem);
+ Deque<DynamoDBItem<?>> batch = batchesByTable.computeIfAbsent(tableName, new Function<>() {
+ @Override
+ public Deque<DynamoDBItem<?>> apply(String t) {
+ return new ArrayDeque<>();
+ }
+ });
+ batch.add(dynamoItem);
+ }
+ return batchesByTable;
+ }
+
+ /**
+ * Flush batch of data to DynamoDB
+ *
+ * @param mapper mapper associated with the batch
+ * @param batch batch of data to write to DynamoDB
+ */
+ private void flushBatch(DynamoDBMapper mapper, Deque<DynamoDBItem<?>> batch) {
+ long currentTimeMillis = System.currentTimeMillis();
+ List<FailedBatch> failed = mapper.batchSave(batch);
+ for (FailedBatch failedBatch : failed) {
+ if (failedBatch.getException() instanceof ResourceNotFoundException) {
+ // Table did not exist. Try again after creating table
+ retryFlushAfterCreatingTable(mapper, batch, failedBatch);
+ } else {
+ logger.debug("Batch failed with {}. Retrying with exponential back-off",
+ failedBatch.getException().getMessage());
+ new ExponentialBackoffRetry(failedBatch.getUnprocessedItems()).run();
+ }
+ }
+ if (failed.isEmpty()) {
+ logger.debug("flushBatch ended with {} items in {} ms: {}", batch.size(),
+ System.currentTimeMillis() - currentTimeMillis, batch);
+ } else {
+ logger.warn(
+ "flushBatch ended with {} items in {} ms: {}. There were some failed batches that were retried -- check logs for ERRORs to see if writes were successful",
+ batch.size(), System.currentTimeMillis() - currentTimeMillis, batch);
+ }
+ }
+
+ /**
+ * Retry flushing data after creating table associated with mapper
+ *
+ * @param mapper mapper associated with the batch
+ * @param batch original batch of data. Used for logging and to determine table name
+ * @param failedBatch failed batch that should be retried
+ */
+ private void retryFlushAfterCreatingTable(DynamoDBMapper mapper, Deque<DynamoDBItem<?>> batch,
+ FailedBatch failedBatch) {
+ logger.debug("Table was not found. Trying to create the table and save again");
+ if (createTable(mapper, batch.peek().getClass())) {
+ logger.debug("Table creation successful, trying to save again");
+ if (!failedBatch.getUnprocessedItems().isEmpty()) {
+ ExponentialBackoffRetry retry = new ExponentialBackoffRetry(failedBatch.getUnprocessedItems());
+ retry.run();
+ if (retry.getUnprocessedItems().isEmpty()) {
+ logger.debug("Successfully saved items after table creation");
+ }
+ }
+ } else {
+ logger.warn("Table creation failed. Not storing some parts of batch: {}. Unprocessed items: {}", batch,
+ failedBatch.getUnprocessedItems());
+ }
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ logger.debug("Got query for item {}", filter.getItemName());
+ if (!isProperlyConfigured) {
+ logger.debug("Configuration for dynamodb not yet loaded or broken. Not executing query.");
+ return Collections.emptyList();
+ }
+ if (!ensureClient()) {
+ logger.warn("DynamoDB not connected. Not executing query.");
+ return Collections.emptyList();
+ }
+
+ String itemName = filter.getItemName();
+ Item item = getItemFromRegistry(itemName);
+ if (item == null) {
+ logger.warn("Could not get item {} from registry!", itemName);
+ return Collections.emptyList();
+ }
+
+ Class<DynamoDBItem<?>> dtoClass = AbstractDynamoDBItem.getDynamoItemClass(item.getClass());
+ String tableName = tableNameResolver.fromClass(dtoClass);
+ DynamoDBMapper mapper = getDBMapper(tableName);
+ logger.debug("Querying item {} (class {}) using DTO class {} from table {}", itemName,
+ item.getClass(), dtoClass, tableName);
+
+ List<HistoricItem> historicItems = new ArrayList<>();
+
+ DynamoDBQueryExpression<DynamoDBItem<?>> queryExpression = DynamoDBQueryUtils.createQueryExpression(dtoClass,
+ filter);
+ @SuppressWarnings("rawtypes")
+ final PaginatedQueryList<? extends DynamoDBItem> paginatedList;
+ try {
+ paginatedList = mapper.query(dtoClass, queryExpression);
+ } catch (AmazonServiceException e) {
+ logger.error(
+ "DynamoDB query raised unexpected exception: {}. Returning empty collection. "
+ + "Status code 400 (resource not found) might occur if table was just created.",
+ e.getMessage());
+ return Collections.emptyList();
+ }
+ for (int itemIndexOnPage = 0; itemIndexOnPage < filter.getPageSize(); itemIndexOnPage++) {
+ int itemIndex = filter.getPageNumber() * filter.getPageSize() + itemIndexOnPage;
+ DynamoDBItem<?> dynamoItem;
+ try {
+ dynamoItem = paginatedList.get(itemIndex);
+ } catch (IndexOutOfBoundsException e) {
+ logger.debug("Index {} is out-of-bounds", itemIndex);
+ break;
+ }
+ if (dynamoItem != null) {
+ HistoricItem historicItem = dynamoItem.asHistoricItem(item);
+ logger.trace("Dynamo item {} converted to historic item: {}", item, historicItem);
+ historicItems.add(historicItem);
+ }
+ }
+ return historicItems;
+ }
+
+ /**
+ * Retrieves the item for the given name from the item registry
+ *
+ * @param itemName
+ * @return item with the given name, or null if no such item exists in item registry.
+ */
+ private @Nullable Item getItemFromRegistry(String itemName) {
+ try {
+ return itemRegistry.getItem(itemName);
+ } catch (ItemNotFoundException e) {
+ logger.error("Unable to get item {} from registry", itemName);
+ return null;
+ }
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return List.of(PersistenceStrategy.Globals.RESTORE, PersistenceStrategy.Globals.CHANGE);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+import java.util.Collections;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Operator;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBQueryExpression;
+import com.amazonaws.services.dynamodbv2.model.AttributeValue;
+import com.amazonaws.services.dynamodbv2.model.ComparisonOperator;
+import com.amazonaws.services.dynamodbv2.model.Condition;
+
+/**
+ * Utility class for constructing DynamoDB query expressions from openHAB FilterCriteria
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@NonNullByDefault
+public class DynamoDBQueryUtils {
+ /**
+ * Construct DynamoDB query from filter criteria
+ *
+ * @param dtoClass the concrete DynamoDBItem class to query
+ * @param filter filter criteria to convert
+ * @return DynamoDBQueryExpression corresponding to the given FilterCriteria
+ */
+ public static DynamoDBQueryExpression<DynamoDBItem<?>> createQueryExpression(
+ Class<? extends DynamoDBItem<?>> dtoClass, FilterCriteria filter) {
+ DynamoDBItem<?> item = getDynamoDBHashKey(dtoClass, filter.getItemName());
+ final DynamoDBQueryExpression<DynamoDBItem<?>> queryExpression = new DynamoDBQueryExpression<DynamoDBItem<?>>()
+ .withHashKeyValues(item).withScanIndexForward(filter.getOrdering() == Ordering.ASCENDING)
+ .withLimit(filter.getPageSize());
+ maybeAddTimeFilter(queryExpression, filter);
+ maybeAddStateFilter(filter, queryExpression);
+ return queryExpression;
+ }
+
+ private static DynamoDBItem<?> getDynamoDBHashKey(Class<? extends DynamoDBItem<?>> dtoClass, String itemName) {
+ DynamoDBItem<?> item;
+ try {
+ // Avoid the deprecated Class.newInstance(); assumes a public no-argument constructor
+ item = dtoClass.getDeclaredConstructor().newInstance();
+ } catch (ReflectiveOperationException e) {
+ throw new RuntimeException(e);
+ }
+ item.setName(itemName);
+ return item;
+ }
+
+ private static void maybeAddStateFilter(FilterCriteria filter,
+ final DynamoDBQueryExpression<DynamoDBItem<?>> queryExpression) {
+ if (filter.getOperator() != null && filter.getState() != null) {
+ // Convert filter's state to DynamoDBItem in order to get a suitable string representation of the state
+ final DynamoDBItem<?> filterState = AbstractDynamoDBItem.fromState(filter.getItemName(), filter.getState(),
+ ZonedDateTime.now());
+ queryExpression.setFilterExpression(String.format("%s %s :opstate", DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE,
+ operatorAsString(filter.getOperator())));
+
+ filterState.accept(new DynamoDBItemVisitor() {
+
+ @Override
+ public void visit(DynamoDBStringItem dynamoStringItem) {
+ queryExpression.setExpressionAttributeValues(Collections.singletonMap(":opstate",
+ new AttributeValue().withS(dynamoStringItem.getState())));
+ }
+
+ @Override
+ public void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
+ queryExpression.setExpressionAttributeValues(Collections.singletonMap(":opstate",
+ new AttributeValue().withN(dynamoBigDecimalItem.getState().toPlainString())));
+ }
+ });
+ }
+ }
+
+ private static @Nullable Condition maybeAddTimeFilter(
+ final DynamoDBQueryExpression<DynamoDBItem<?>> queryExpression, final FilterCriteria filter) {
+ final Condition timeCondition = constructTimeCondition(filter);
+ if (timeCondition != null) {
+ queryExpression.setRangeKeyConditions(
+ Collections.singletonMap(DynamoDBItem.ATTRIBUTE_NAME_TIMEUTC, timeCondition));
+ }
+ return timeCondition;
+ }
+
+ private static @Nullable Condition constructTimeCondition(FilterCriteria filter) {
+ boolean hasBegin = filter.getBeginDate() != null;
+ boolean hasEnd = filter.getEndDate() != null;
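+ // begin only -> GE, end only -> LE, both -> BETWEEN, neither -> no condition (on the time range key)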
+
+ final Condition timeCondition;
+ if (!hasBegin && !hasEnd) {
+ timeCondition = null;
+ } else if (hasBegin && !hasEnd) {
+ timeCondition = new Condition().withComparisonOperator(ComparisonOperator.GE).withAttributeValueList(
+ new AttributeValue().withS(filter.getBeginDate().format(AbstractDynamoDBItem.DATEFORMATTER)));
+ } else if (!hasBegin && hasEnd) {
+ timeCondition = new Condition().withComparisonOperator(ComparisonOperator.LE).withAttributeValueList(
+ new AttributeValue().withS(filter.getEndDate().format(AbstractDynamoDBItem.DATEFORMATTER)));
+ } else {
+ timeCondition = new Condition().withComparisonOperator(ComparisonOperator.BETWEEN).withAttributeValueList(
+ new AttributeValue().withS(filter.getBeginDate().format(AbstractDynamoDBItem.DATEFORMATTER)),
+ new AttributeValue().withS(filter.getEndDate().format(AbstractDynamoDBItem.DATEFORMATTER)));
+ }
+ return timeCondition;
+ }
+
+ /**
+ * Convert op to string suitable for dynamodb filter expression
+ *
+ * @param op
+ * @return string representation corresponding to the given Operator
+ */
+ private static String operatorAsString(Operator op) {
+ switch (op) {
+ case EQ:
+ return "=";
+ case NEQ:
+ return "<>";
+ case LT:
+ return "<";
+ case LTE:
+ return "<=";
+ case GT:
+ return ">";
+ case GTE:
+ return ">=";
+ default:
+ throw new IllegalStateException("Unknown operator " + op);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBDocument;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
+import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBRangeKey;
+
+/**
+ * DynamoDBItem for items that can be serialized as DynamoDB string
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@DynamoDBDocument
+public class DynamoDBStringItem extends AbstractDynamoDBItem<String> {
+
+ public DynamoDBStringItem() {
+ this(null, null, null);
+ }
+
+ public DynamoDBStringItem(String name, String state, ZonedDateTime time) {
+ super(name, state, time);
+ }
+
+ @DynamoDBAttribute(attributeName = DynamoDBItem.ATTRIBUTE_NAME_ITEMSTATE)
+ @Override
+ public String getState() {
+ return state;
+ }
+
+ @DynamoDBHashKey(attributeName = DynamoDBItem.ATTRIBUTE_NAME_ITEMNAME)
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ @DynamoDBRangeKey(attributeName = ATTRIBUTE_NAME_TIMEUTC)
+ public ZonedDateTime getTime() {
+ return time;
+ }
+
+ @Override
+ public void accept(org.openhab.persistence.dynamodb.internal.DynamoDBItemVisitor visitor) {
+ visitor.visit(this);
+ }
+
+ @Override
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ @Override
+ public void setState(String state) {
+ this.state = state;
+ }
+
+ @Override
+ public void setTime(ZonedDateTime time) {
+ this.time = time;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+/**
+ * The DynamoDBTableNameResolver resolves the DynamoDB table name for a given item.
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+public class DynamoDBTableNameResolver {
+
+ private final String tablePrefix;
+
+ public DynamoDBTableNameResolver(String tablePrefix) {
+ this.tablePrefix = tablePrefix;
+ }
+
+ public String fromItem(DynamoDBItem<?> item) {
+ final String[] tableName = new String[1];
+
+ // Use the visitor pattern to deduce the table name
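+ // e.g. with tablePrefix "openhab-", a DynamoDBBigDecimalItem resolves to table "openhab-bigdecimal"
+ // and a DynamoDBStringItem to "openhab-string"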
+ item.accept(new DynamoDBItemVisitor() {
+
+ @Override
+ public void visit(DynamoDBBigDecimalItem dynamoBigDecimalItem) {
+ tableName[0] = tablePrefix + "bigdecimal";
+ }
+
+ @Override
+ public void visit(DynamoDBStringItem dynamoStringItem) {
+ tableName[0] = tablePrefix + "string";
+ }
+ });
+ return tableName[0];
+ }
+
+ /**
+ * Resolve the DynamoDB table name corresponding to the given DynamoDBItem class
+ *
+ * @param clazz DynamoDBItem class
+ * @return table name for the given class
+ */
+ public String fromClass(Class<? extends DynamoDBItem<?>> clazz) {
+ DynamoDBItem<?> dummy;
+ try {
+ // Construct a new instance of this class (assuming the presence of a no-argument constructor)
+ // in order to re-use fromItem(DynamoDBItem)
+ dummy = clazz.getConstructor().newInstance();
+ } catch (Exception e) {
+ throw new IllegalStateException(String.format("Could not find suitable constructor for class %s", clazz),
+ e);
+ }
+ return this.fromItem(dummy);
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<config-description:config-descriptions
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:config-description="https://openhab.org/schemas/config-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/config-description/v1.0.0
+ https://openhab.org/schemas/config-description-1.0.0.xsd">
+
+ <config-description uri="persistence:dynamodb">
+
+ <!--
+ ############################ Amazon DynamoDB Persistence Service ##################################
+ #
+ # The following parameters are used to configure Amazon DynamoDB Persistence.
+ #
+ # Further details at https://docs.openhab.org/addons/persistence/dynamodb/readme.html
+ #
+
+ #
+ # CONNECTION SETTINGS (follow OPTION 1 or OPTION 2)
+ #
+
+ # OPTION 1 (using accessKey and secretKey)
+ #accessKey=AKIAIOSFODNN7EXAMPLE
+ #secretKey=3+AAAAABBBbbbCCCCCCdddddd+7mnbIOLH
+ #region=eu-west-1
+
+ # OPTION 2 (using profilesConfigFile and profile)
+ # where profilesConfigFile points to AWS credentials file
+ #profilesConfigFile=/etc/openhab2/aws_creds
+ #profile=fooprofile
+ #region=eu-west-1
+
+ # Credentials file example:
+ #
+ # [fooprofile]
+ # aws_access_key_id=AKIAIOSFODNN7EXAMPLE
+ # aws_secret_access_key=3+AAAAABBBbbbCCCCCCdddddd+7mnbIOLH
+
+
+ #
+ # ADVANCED CONFIGURATION (OPTIONAL)
+ #
+
+ # read capacity for the created tables
+ #readCapacityUnits=1
+
+ # write capacity for the created tables
+ #writeCapacityUnits=1
+
+ # table prefix used in the name of created tables
+ #tablePrefix=openhab-
+ -->
+
+ <parameter name="region" type="text" required="true">
+ <label>AWS region ID</label>
+ <description><![CDATA[AWS region ID as described in Step 2 in Setting up Amazon account.<br />
+ The region needs to match the region of the AWS user that will access Amazon DynamoDB.<br />
+ For example, eu-west-1.]]></description>
+ </parameter>
+
+ <parameter name="accessKey" type="text" required="false">
+ <label>AWS access key</label>
+ <description><![CDATA[AWS access key of the AWS user that will access Amazon DynamoDB.
+ <br />
+ Give either 1) access key and secret key, or 2) credentials file and profile name.
+ ]]></description>
+ </parameter>
+
+ <parameter name="secretKey" type="text" required="false">
+ <label>AWS secret key</label>
+ <description><![CDATA[AWS secret key of the AWS user that will access Amazon DynamoDB.
+ <br />
+ Give either 1) access key and secret key, or 2) credentials file and profile name.
+ ]]></description>
+ </parameter>
+
+
+ <parameter name="profilesConfigFile" type="text" required="false">
+ <label>AWS credentials file</label>
+ <description><![CDATA[Path to the AWS credentials file. <br />
+ For example, /etc/openhab2/aws_creds.
+ Please note that the user that runs openHAB must have appropriate read rights to the credentials file.
+ <br />
+ Give either 1) access key and secret key, or 2) credentials file and profile name.
+ ]]></description>
+ </parameter>
+
+ <parameter name="profile" type="text" required="false">
+ <label>Profile name</label>
+ <description><![CDATA[Name of the profile to use in AWS credentials file
+ <br />
+ Give either 1) access key and secret key, or 2) credentials file and profile name.
+ ]]></description>
+ </parameter>
+
+
+ <parameter name="readCapacityUnits" type="integer" required="false" min="1">
+ <description>Read capacity for the created tables. Default is 1.</description>
+ <label>Read capacity</label>
+ <advanced>true</advanced>
+ </parameter>
+
+ <parameter name="writeCapacityUnits" type="integer" required="false" min="1">
+ <label>Write capacity</label>
+ <description>Write capacity for the created tables. Default is 1.</description>
+ <advanced>true</advanced>
+ </parameter>
+
+ <parameter name="tablePrefix" type="text" required="false">
+ <label>Table prefix</label>
+ <description>Table prefix used in the name of created tables. Default is "openhab-".</description>
+ <advanced>true</advanced>
+ </parameter>
+
+ </config-description>
+
+</config-description:config-descriptions>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.jupiter.api.Assertions.assertEquals;
+
+import java.io.IOException;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.junit.jupiter.api.Test;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.PlayerItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.items.SwitchItem;
+
+/**
+ * Test for AbstractDynamoDBItem.getDynamoItemClass
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@NonNullByDefault
+public class AbstractDynamoDBItemGetDynamoItemClassTest {
+
+ @Test
+ public void testCallItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(CallItem.class));
+ }
+
+ @Test
+ public void testContactItem() throws IOException {
+ assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(ContactItem.class));
+ }
+
+ @Test
+ public void testDateTimeItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(DateTimeItem.class));
+ }
+
+ @Test
+ public void testStringItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(StringItem.class));
+ }
+
+ @Test
+ public void testLocationItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(LocationItem.class));
+ }
+
+ @Test
+ public void testNumberItem() throws IOException {
+ assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(NumberItem.class));
+ }
+
+ @Test
+ public void testColorItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(ColorItem.class));
+ }
+
+ @Test
+ public void testDimmerItem() throws IOException {
+ assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(DimmerItem.class));
+ }
+
+ @Test
+ public void testPlayerItem() throws IOException {
+ assertEquals(DynamoDBStringItem.class, AbstractDynamoDBItem.getDynamoItemClass(PlayerItem.class));
+ }
+
+ @Test
+ public void testRollershutterItem() throws IOException {
+ assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(RollershutterItem.class));
+ }
+
+ @Test
+ public void testOnOffTypeWithSwitchItem() throws IOException {
+ assertEquals(DynamoDBBigDecimalItem.class, AbstractDynamoDBItem.getDynamoItemClass(SwitchItem.class));
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.jupiter.api.Assertions.*;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.time.Instant;
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.util.TimeZone;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.junit.jupiter.api.Test;
+import org.openhab.core.items.Item;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.library.types.StringListType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.library.types.UpDownType;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+
+/**
+ * Test for AbstractDynamoDBItem.fromState and AbstractDynamoDBItem.asHistoricItem for all kind of states
+ *
+ * @author Sami Salonen - Initial contribution
+ */
+@NonNullByDefault
+public class AbstractDynamoDBItemSerializationTest {
+
+ private final ZonedDateTime date = ZonedDateTime.ofInstant(Instant.ofEpochSecond(400), ZoneId.systemDefault());
+
+ /**
+ * Generic function testing serialization of item state to internal format in DB. In other words, conversion of
+ * Item with state to DynamoDBItem
+ *
+ * @param state item state
+ * @param expectedState internal format in DB representing the item state
+ * @return dynamo db item
+ * @throws IOException
+ */
+ public DynamoDBItem<?> testStateGeneric(State state, Object expectedState) throws IOException {
+ DynamoDBItem<?> dbItem = AbstractDynamoDBItem.fromState("item1", state, date);
+
+ assertEquals("item1", dbItem.getName());
+ assertEquals(date, dbItem.getTime());
+ Object actualState = dbItem.getState();
+ if (expectedState instanceof BigDecimal) {
+ BigDecimal expectedRounded = DynamoDBBigDecimalItem.loseDigits(((BigDecimal) expectedState));
+ assertEquals(0, expectedRounded.compareTo((BigDecimal) actualState),
+ String.format("Expected state %s (%s but with some digits lost) did not match actual state %s",
+ expectedRounded, expectedState, actualState));
+ } else {
+ assertEquals(expectedState, actualState);
+ }
+ return dbItem;
+ }
+
+ /**
+ * Test state deserialization, that is DynamoDBItem conversion to HistoricItem
+ *
+ * @param dbItem dynamo db item
+ * @param item parameter for DynamoDBItem.asHistoricItem
+ * @param expectedState Expected state of the historic item. DecimalTypes are compared with reduced accuracy
+ * @return
+ * @throws IOException
+ */
+ public HistoricItem testAsHistoricGeneric(DynamoDBItem<?> dbItem, Item item, Object expectedState)
+ throws IOException {
+ HistoricItem historicItem = dbItem.asHistoricItem(item);
+
+ assertEquals("item1", historicItem.getName());
+ assertEquals(date, historicItem.getTimestamp());
+ assertEquals(expectedState.getClass(), historicItem.getState().getClass());
+ if (expectedState instanceof DecimalType) {
+ // serialization loses accuracy, take this into consideration
+ BigDecimal expectedRounded = DynamoDBBigDecimalItem
+ .loseDigits(((DecimalType) expectedState).toBigDecimal());
+ BigDecimal actual = ((DecimalType) historicItem.getState()).toBigDecimal();
+ assertEquals(0, expectedRounded.compareTo(actual),
+ String.format("Expected state %s (%s but with some digits lost) did not match actual state %s",
+ expectedRounded, expectedState, actual));
+ } else {
+ assertEquals(expectedState, historicItem.getState());
+ }
+ return historicItem;
+ }
+
+ @Test
+ public void testUndefWithNumberItem() throws IOException {
+ final DynamoDBItem<?> dbitem = testStateGeneric(UnDefType.UNDEF, "<org.openhab.core.types.UnDefType.UNDEF>");
+ assertTrue(dbitem instanceof DynamoDBStringItem);
+ testAsHistoricGeneric(dbitem, new NumberItem("foo"), UnDefType.UNDEF);
+ }
+
+ @Test
+ public void testCallTypeWithCallItem() throws IOException {
+ final DynamoDBItem<?> dbitem = testStateGeneric(new StringListType("origNum", "destNum"), "origNum,destNum");
+ testAsHistoricGeneric(dbitem, new CallItem("foo"), new StringListType("origNum", "destNum"));
+ }
+
+ @Test
+ public void testOpenClosedTypeWithContactItem() throws IOException {
+ final DynamoDBItem<?> dbitemOpen = testStateGeneric(OpenClosedType.CLOSED, BigDecimal.ZERO);
+ testAsHistoricGeneric(dbitemOpen, new ContactItem("foo"), OpenClosedType.CLOSED);
+
+ final DynamoDBItem<?> dbitemClosed = testStateGeneric(OpenClosedType.OPEN, BigDecimal.ONE);
+ testAsHistoricGeneric(dbitemClosed, new ContactItem("foo"), OpenClosedType.OPEN);
+ }
+
+ @Test
+ public void testDateTimeTypeWithDateTimeItem() throws IOException {
+ ZonedDateTime zdt = ZonedDateTime.parse("2016-05-01T13:46:00.050Z");
+ DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType(zdt.toString()), "2016-05-01T13:46:00.050Z");
+ testAsHistoricGeneric(dbitem, new DateTimeItem("foo"),
+ new DateTimeType(zdt.withZoneSameInstant(ZoneId.systemDefault())));
+ }
+
+ @Test
+ public void testDateTimeTypeWithStringItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType(ZonedDateTime.parse("2016-05-01T13:46:00.050Z")),
+ "2016-05-01T13:46:00.050Z");
+ testAsHistoricGeneric(dbitem, new StringItem("foo"), new StringType("2016-05-01T13:46:00.050Z"));
+ }
+
+ @Test
+ public void testDateTimeTypeLocalWithDateTimeItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType("2016-07-17T19:38:07.050+0300"),
+ "2016-07-17T16:38:07.050Z");
+
+ ZonedDateTime expectedZdt = Instant.ofEpochMilli(1468773487050L).atZone(ZoneId.systemDefault());
+ testAsHistoricGeneric(dbitem, new DateTimeItem("foo"), new DateTimeType(expectedZdt));
+ }
+
+ @Test
+ public void testDateTimeTypeLocalWithStringItem() throws IOException {
+ Instant instant = Instant.ofEpochMilli(1468773487050L); // GMT: Sun, 17 Jul 2016 16:38:07.050 GMT
+ ZonedDateTime zdt = instant.atZone(TimeZone.getTimeZone("GMT+03:00").toZoneId());
+ DynamoDBItem<?> dbitem = testStateGeneric(new DateTimeType(zdt), "2016-07-17T16:38:07.050Z");
+ testAsHistoricGeneric(dbitem, new StringItem("foo"), new StringType("2016-07-17T16:38:07.050Z"));
+ }
+
+ @Test
+ public void testPointTypeWithLocationItem() throws IOException {
+ final PointType point = new PointType(new DecimalType(60.3), new DecimalType(30.2), new DecimalType(510.90));
+ String expected = point.getLatitude().toBigDecimal().toString() + ","
+ + point.getLongitude().toBigDecimal().toString() + "," + point.getAltitude().toBigDecimal().toString();
+ DynamoDBItem<?> dbitem = testStateGeneric(point, expected);
+ testAsHistoricGeneric(dbitem, new LocationItem("foo"), point);
+ }
+
+ @Test
+ public void testDecimalTypeWithNumberItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new DecimalType("3.2"), new BigDecimal("3.2"));
+ testAsHistoricGeneric(dbitem, new NumberItem("foo"), new DecimalType("3.2"));
+ }
+
+ @Test
+ public void testPercentTypeWithColorItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2"));
+ testAsHistoricGeneric(dbitem, new ColorItem("foo"), new PercentType(new BigDecimal("3.2")));
+ }
+
+ @Test
+ public void testPercentTypeWithDimmerItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2"));
+ testAsHistoricGeneric(dbitem, new DimmerItem("foo"), new PercentType(new BigDecimal("3.2")));
+ }
+
+ @Test
+ public void testPercentTypeWithRollerShutterItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2"));
+ testAsHistoricGeneric(dbitem, new RollershutterItem("foo"), new PercentType(new BigDecimal("3.2")));
+ }
+
+ @Test
+ public void testPercentTypeWithNumberItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new PercentType(new BigDecimal("3.2")), new BigDecimal("3.2"));
+ // note: comes back as DecimalType instead of the original PercentType
+ testAsHistoricGeneric(dbitem, new NumberItem("foo"), new DecimalType(new BigDecimal("3.2")));
+ }
+
+ @Test
+ public void testUpDownTypeWithRollershutterItem() throws IOException {
+ // note: comes back as PercentType instead of the original UpDownType
+ DynamoDBItem<?> dbItemDown = testStateGeneric(UpDownType.DOWN, BigDecimal.ZERO);
+ testAsHistoricGeneric(dbItemDown, new RollershutterItem("foo"), new PercentType(BigDecimal.ZERO));
+
+ DynamoDBItem<?> dbItemUp = testStateGeneric(UpDownType.UP, BigDecimal.ONE);
+ testAsHistoricGeneric(dbItemUp, new RollershutterItem("foo"), new PercentType(BigDecimal.ONE));
+ }
+
+ @Test
+ public void testStringTypeWithStringItem() throws IOException {
+ DynamoDBItem<?> dbitem = testStateGeneric(new StringType("foo bar"), "foo bar");
+ testAsHistoricGeneric(dbitem, new StringItem("foo"), new StringType("foo bar"));
+ }
+
+ @Test
+ public void testOnOffTypeWithColorItem() throws IOException {
+ DynamoDBItem<?> dbitemOff = testStateGeneric(OnOffType.OFF, BigDecimal.ZERO);
+ testAsHistoricGeneric(dbitemOff, new ColorItem("foo"), new PercentType(BigDecimal.ZERO));
+
+ DynamoDBItem<?> dbitemOn = testStateGeneric(OnOffType.ON, BigDecimal.ONE);
+ testAsHistoricGeneric(dbitemOn, new ColorItem("foo"), new PercentType(BigDecimal.ONE));
+ }
+
+ @Test
+ public void testOnOffTypeWithDimmerItem() throws IOException {
+ DynamoDBItem<?> dbitemOff = testStateGeneric(OnOffType.OFF, BigDecimal.ZERO);
+ testAsHistoricGeneric(dbitemOff, new DimmerItem("foo"), new PercentType(BigDecimal.ZERO));
+
+ DynamoDBItem<?> dbitemOn = testStateGeneric(OnOffType.ON, BigDecimal.ONE);
+ testAsHistoricGeneric(dbitemOn, new DimmerItem("foo"), new PercentType(BigDecimal.ONE));
+ }
+
+ @Test
+ public void testOnOffTypeWithSwitchItem() throws IOException {
+ DynamoDBItem<?> dbitemOff = testStateGeneric(OnOffType.OFF, BigDecimal.ZERO);
+ testAsHistoricGeneric(dbitemOff, new SwitchItem("foo"), OnOffType.OFF);
+
+ DynamoDBItem<?> dbitemOn = testStateGeneric(OnOffType.ON, BigDecimal.ONE);
+ testAsHistoricGeneric(dbitemOn, new SwitchItem("foo"), OnOffType.ON);
+ }
+
+ @Test
+ public void testHSBTypeWithColorItem() throws IOException {
+ HSBType hsb = new HSBType(new DecimalType(1.5), new PercentType(new BigDecimal(2.5)),
+ new PercentType(new BigDecimal(3.5)));
+ DynamoDBItem<?> dbitem = testStateGeneric(hsb, "1.5,2.5,3.5");
+ testAsHistoricGeneric(dbitem, new ColorItem("foo"), hsb);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.jupiter.api.Assertions.*;
+import static org.junit.jupiter.api.Assumptions.assumeTrue;
+
+import java.time.ZonedDateTime;
+import java.util.Iterator;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.junit.jupiter.api.Test;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Operator;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * This abstract class helps with integration testing of the persistence service. Different kinds of queries are
+ * tested against an actual DynamoDB database.
+ *
+ * An inheritor of this base class needs to store two states of one item in a static method annotated with
+ * {@code @BeforeAll}. This static method should update the protected static fields beforeStore (date before storing
+ * anything), afterStore1 (after storing the first item, but before storing the second item) and afterStore2 (after
+ * storing the second item). The item name must correspond to getItemName. The first state needs to be smaller than
+ * the second state.
+ *
+ * To have more comprehensive tests, the inheritor class can define getQueryItemStateBetween to provide a value
+ * between the two states. Null can be used to omit the additional tests.
+ *
+ * See DimmerItemIntegrationTest for an example of how to use this base class.
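+ *
+ * A minimal sketch of a subclass (hypothetical NumberItem-based test; the item name and states are for
+ * illustration only):
+ *
+ * <pre>
+ * public class MyNumberItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+ *     private static final String NAME = "number";
+ *
+ *     &#64;BeforeAll
+ *     public static void storeData() throws InterruptedException {
+ *         DynamoDBPersistenceService service = BaseIntegrationTest.service;
+ *         if (service == null) {
+ *             return; // AWS credentials missing, tests will be skipped
+ *         }
+ *         NumberItem item = (NumberItem) ITEMS.get(NAME);
+ *         item.setState(new DecimalType(1));
+ *         beforeStore = ZonedDateTime.now();
+ *         Thread.sleep(10);
+ *         service.store(item);
+ *         afterStore1 = ZonedDateTime.now();
+ *         Thread.sleep(10);
+ *         item.setState(new DecimalType(2));
+ *         service.store(item);
+ *         afterStore2 = ZonedDateTime.now();
+ *     }
+ *
+ *     &#64;Override
+ *     protected String getItemName() {
+ *         return NAME;
+ *     }
+ *
+ *     &#64;Override
+ *     protected State getFirstItemState() {
+ *         return new DecimalType(1);
+ *     }
+ *
+ *     &#64;Override
+ *     protected State getSecondItemState() {
+ *         return new DecimalType(2);
+ *     }
+ *
+ *     &#64;Override
+ *     protected &#64;Nullable State getQueryItemStateBetween() {
+ *         return new DecimalType(1.5);
+ *     }
+ * }
+ * </pre>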
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public abstract class AbstractTwoItemIntegrationTest extends BaseIntegrationTest {
+
+ protected static @Nullable ZonedDateTime beforeStore;
+ protected static @Nullable ZonedDateTime afterStore1;
+ protected static @Nullable ZonedDateTime afterStore2;
+
+ protected abstract String getItemName();
+
+ /**
+ * State of the item stored first, should be smaller than the second value
+ *
+ * @return
+ */
+ protected abstract State getFirstItemState();
+
+ /**
+ * State of the item stored second, should be larger than the first value
+ *
+ * @return
+ */
+ protected abstract State getSecondItemState();
+
+ /**
+ * State that is between the first and second. Use null to omit extended tests using this value.
+ *
+ * @return
+ */
+ protected abstract @Nullable State getQueryItemStateBetween();
+
+ protected void assertStateEquals(State expected, State actual) {
+ assertEquals(expected, actual);
+ }
+
+ @BeforeAll
+ public static void checkService() throws InterruptedException {
+ String msg = "DynamoDB integration tests will be skipped. Did you specify AWS credentials for testing? "
+ + "See BaseIntegrationTest for more details";
+ if (service == null) {
+ System.out.println(msg);
+ }
+ assumeTrue(service != null, msg);
+ }
+
+ /**
+ * Asserts that iterable contains correct items and nothing else
+ *
+ */
+ private void assertIterableContainsItems(Iterable<HistoricItem> iterable, boolean ascending) {
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ HistoricItem actual2 = iterator.next();
+ assertFalse(iterator.hasNext());
+
+ for (HistoricItem actual : new HistoricItem[] { actual1, actual2 }) {
+ assertEquals(getItemName(), actual.getName());
+ }
+ HistoricItem storedFirst;
+ HistoricItem storedSecond;
+ if (ascending) {
+ storedFirst = actual1;
+ storedSecond = actual2;
+ } else {
+ storedFirst = actual2;
+ storedSecond = actual1;
+ }
+
+ assertStateEquals(getFirstItemState(), storedFirst.getState());
+ assertTrue(storedFirst.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
+ assertTrue(storedFirst.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
+
+ assertStateEquals(getSecondItemState(), storedSecond.getState());
+ assertTrue(storedSecond.getTimestamp().toInstant().isBefore(afterStore2.toInstant()));
+ assertTrue(storedSecond.getTimestamp().toInstant().isAfter(afterStore1.toInstant()));
+ }
+
+ @Test
+ public void testQueryUsingName() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setItemName(getItemName());
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertIterableContainsItems(iterable, true);
+ }
+
+ @Test
+ public void testQueryUsingNameAndStart() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertIterableContainsItems(iterable, true);
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartNoMatch() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertFalse(iterable.iterator().hasNext());
+ }
+
+ @Test
+ public void testQueryUsingNameAndEnd() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setItemName(getItemName());
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertIterableContainsItems(iterable, true);
+ }
+
+ @Test
+ public void testQueryUsingNameAndEndNoMatch() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(getItemName());
+ criteria.setEndDate(beforeStore);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertFalse(iterable.iterator().hasNext());
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEnd() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertIterableContainsItems(iterable, true);
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndDesc() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.DESCENDING);
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertIterableContainsItems(iterable, false);
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithNEQOperator() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.NEQ);
+ criteria.setState(getSecondItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getFirstItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
+ assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithEQOperator() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.EQ);
+ criteria.setState(getFirstItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getFirstItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
+ assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithLTOperator() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.LT);
+ criteria.setState(getSecondItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getFirstItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
+ assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithLTOperatorNoMatch() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.LT);
+ criteria.setState(getFirstItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ assertFalse(iterator.hasNext());
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithLTEOperator() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.LTE);
+ criteria.setState(getFirstItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getFirstItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
+ assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithGTOperator() {
+ // Skip for subclasses which have null "state between"
+ assumeTrue(getQueryItemStateBetween() != null);
+
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.GT);
+ criteria.setState(getQueryItemStateBetween());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getSecondItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore2.toInstant()));
+ assertTrue(actual1.getTimestamp().toInstant().isAfter(afterStore1.toInstant()));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithGTOperatorNoMatch() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.GT);
+ criteria.setState(getSecondItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ assertFalse(iterator.hasNext());
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndWithGTEOperator() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOperator(Operator.GTE);
+ criteria.setState(getSecondItemState());
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore2);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getSecondItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore2.toInstant()));
+ assertTrue(actual1.getTimestamp().toInstant().isAfter(afterStore1.toInstant()));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndFirst() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(afterStore1);
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+
+ Iterator<HistoricItem> iterator = iterable.iterator();
+ HistoricItem actual1 = iterator.next();
+ assertFalse(iterator.hasNext());
+ assertStateEquals(getFirstItemState(), actual1.getState());
+ assertTrue(actual1.getTimestamp().toInstant().isBefore(afterStore1.toInstant()));
+ assertTrue(actual1.getTimestamp().toInstant().isAfter(beforeStore.toInstant()));
+ }
+
+ @Test
+ public void testQueryUsingNameAndStartAndEndNoMatch() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(getItemName());
+ criteria.setBeginDate(beforeStore);
+ criteria.setEndDate(beforeStore); // sic
+ Iterable<HistoricItem> iterable = BaseIntegrationTest.service.query(criteria);
+ assertFalse(iterable.iterator().hasNext());
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Map.Entry;
+import java.util.stream.Stream;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.common.registry.RegistryChangeListener;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemNotUniqueException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.items.RegistryHook;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.PlayerItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class BaseIntegrationTest {
+ protected static final Logger LOGGER = LoggerFactory.getLogger(DynamoDBPersistenceService.class);
+ protected static @Nullable DynamoDBPersistenceService service;
+ protected static final Map<String, Item> ITEMS = new HashMap<>();
+
+ static {
+ System.setProperty("org.slf4j.simpleLogger.defaultLogLevel", "trace");
+ }
+
+ @BeforeAll
+ public static void initService() throws InterruptedException {
+ ITEMS.put("dimmer", new DimmerItem("dimmer"));
+ ITEMS.put("number", new NumberItem("number"));
+ ITEMS.put("string", new StringItem("string"));
+ ITEMS.put("switch", new SwitchItem("switch"));
+ ITEMS.put("contact", new ContactItem("contact"));
+ ITEMS.put("color", new ColorItem("color"));
+ ITEMS.put("rollershutter", new RollershutterItem("rollershutter"));
+ ITEMS.put("datetime", new DateTimeItem("datetime"));
+ ITEMS.put("call", new CallItem("call"));
+ ITEMS.put("location", new LocationItem("location"));
+ ITEMS.put("player_playpause", new PlayerItem("player_playpause"));
+ ITEMS.put("player_rewindfastforward", new PlayerItem("player_rewindfastforward"));
+
+ service = new DynamoDBPersistenceService(new ItemRegistry() {
+ @Override
+ public Collection<Item> getItems(String pattern) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public Collection<Item> getItems() {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public Item getItemByPattern(String name) throws ItemNotFoundException, ItemNotUniqueException {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public Item getItem(String name) throws ItemNotFoundException {
+ Item item = ITEMS.get(name);
+ if (item == null) {
+ throw new ItemNotFoundException(name);
+ }
+ return item;
+ }
+
+ @Override
+ public void addRegistryChangeListener(RegistryChangeListener<Item> listener) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public Collection<Item> getAll() {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public Stream<Item> stream() {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public @Nullable Item get(String key) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public void removeRegistryChangeListener(RegistryChangeListener<Item> listener) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public Item add(Item element) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public @Nullable Item update(Item element) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public @Nullable Item remove(String key) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public Collection<Item> getItemsOfType(String type) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public Collection<Item> getItemsByTag(String... tags) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public Collection<Item> getItemsByTagAndType(String type, String... tags) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public <T extends Item> Collection<T> getItemsByTag(Class<T> typeFilter, String... tags) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public @Nullable Item remove(String itemName, boolean recursive) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public void addRegistryHook(RegistryHook<Item> hook) {
+ throw new UnsupportedOperationException();
+ }
+
+ @Override
+ public void removeRegistryHook(RegistryHook<Item> hook) {
+ throw new UnsupportedOperationException();
+ }
+ });
+
+ Map<String, Object> config = new HashMap<>();
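+ // Region and credentials are read from JVM system properties; if any of them is missing,
+ // the integration tests are skipped (see the check below)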
+ config.put("region", System.getProperty("DYNAMODBTEST_REGION"));
+ config.put("accessKey", System.getProperty("DYNAMODBTEST_ACCESS"));
+ config.put("secretKey", System.getProperty("DYNAMODBTEST_SECRET"));
+ config.put("tablePrefix", "dynamodb-integration-tests-");
+
+ // Disable buffering
+ config.put("bufferSize", "0");
+
+ for (Entry<String, Object> entry : config.entrySet()) {
+ if (entry.getValue() == null) {
+ LOGGER.warn("Expecting {} to have value for integration tests. Integration tests will be skipped",
+ entry.getKey());
+ service = null;
+ return;
+ }
+ }
+
+ service.activate(null, config);
+ clearData();
+ }
+
+ protected static void clearData() {
+ // Clear data
+ for (String table : new String[] { "dynamodb-integration-tests-bigdecimal",
+ "dynamodb-integration-tests-string" }) {
+ try {
+ service.getDb().getDynamoClient().deleteTable(table);
+ service.getDb().getDynamoDB().getTable(table).waitForDelete();
+ } catch (ResourceNotFoundException e) {
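+ // Expected when the table does not exist yet - nothing to delete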
+ } catch (InterruptedException e) {
+ Thread.currentThread().interrupt(); // restore the interrupt flag for callers
+ LOGGER.warn("Interrupted! Table might not have been deleted");
+ }
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.jupiter.api.Assertions.assertEquals;
+
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.types.StringListType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class CallItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "call";
+ // values are encoded as part1,part2 - ordering is by string comparison
+ private static final StringListType STATE1 = new StringListType("part1", "foo");
+ private static final StringListType STATE2 = new StringListType("part3", "bar");
+ private static final StringListType STATE_BETWEEN = new StringListType("part2", "zzz");
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ CallItem item = (CallItem) ITEMS.get(NAME);
+ item.setState(STATE1);
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+
+ @Override
+ protected void assertStateEquals(State expected, State actual) {
+ // Since CallType.equals is broken, toString() is used as a workaround
+ assertEquals(expected.toString(), actual.toString());
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.math.BigDecimal;
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class ColorItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static HSBType color(double hue, int saturation, int brightness) {
+ return new HSBType(new DecimalType(hue), new PercentType(saturation), new PercentType(brightness));
+ }
+
+ private static HSBType color(String hue, int saturation, int brightness) {
+ return new HSBType(new DecimalType(new BigDecimal(hue)), new PercentType(saturation),
+ new PercentType(brightness));
+ }
+
+ private static final String NAME = "color";
+ // values are encoded as <hue>,<saturation>,<brightness> - ordering is by string comparison
+ private static final HSBType STATE1 = color("3.1493842988948932984298384892384823984923849238492839483294893", 50,
+ 50);
+ private static final HSBType STATE2 = color(75, 100, 90);
+ private static final HSBType STATE_BETWEEN = color(60, 50, 50);
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ ColorItem item = (ColorItem) ITEMS.get(NAME);
+ item.setState(STATE1);
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.types.State;
+
+/**
+ * @author Sami Salonen - Initial contribution
+ */
+@NonNullByDefault
+public class ContactItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "contact";
+ private static final OpenClosedType STATE1 = OpenClosedType.CLOSED;
+ private static final OpenClosedType STATE2 = OpenClosedType.OPEN;
+ // There is no OpenClosedType state value between CLOSED and OPEN.
+ // Omit the extended query tests in AbstractTwoItemIntegrationTest by setting stateBetween to null.
+ private static final @Nullable OpenClosedType STATE_BETWEEN = null;
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ ContactItem item = (ContactItem) ITEMS.get(NAME);
+ item.setState(STATE1);
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class DateTimeItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "datetime";
+ private static final ZonedDateTime ZDT1 = ZonedDateTime.parse("2016-06-15T10:00:00Z");
+ private static final ZonedDateTime ZDT2 = ZonedDateTime.parse("2016-06-15T16:00:00.123Z");
+ private static final ZonedDateTime ZDT_BETWEEN = ZonedDateTime.parse("2016-06-15T14:00:00Z");
+
+ private static final DateTimeType STATE1 = new DateTimeType(ZDT1);
+ private static final DateTimeType STATE2 = new DateTimeType(ZDT2);
+ private static final DateTimeType STATE_BETWEEN = new DateTimeType(ZDT_BETWEEN);
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ DateTimeItem item = (DateTimeItem) ITEMS.get(NAME);
+
+ item.setState(STATE1);
+
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class DimmerItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "dimmer";
+ private static final PercentType STATE1 = new PercentType(66);
+ private static final PercentType STATE2 = new PercentType(68);
+ private static final PercentType STATE_BETWEEN = new PercentType(67);
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ DimmerItem item = (DimmerItem) ITEMS.get(NAME);
+
+ item.setState(STATE1);
+
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.jupiter.api.Assertions.*;
+
+import java.io.File;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.nio.file.StandardOpenOption;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.io.TempDir;
+
+import com.amazonaws.regions.Regions;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class DynamoDBConfigTest {
+
+ private static Map<String, Object> mapFrom(String... args) {
+ assert args.length % 2 == 0;
+ Map<String, String> config = new HashMap<>();
+ // args alternate between keys and values, so advance two at a time
+ for (int i = 1; i < args.length; i += 2) {
+ String key = args[i - 1];
+ String val = args[i];
+ config.put(key, val);
+ }
+ return Collections.unmodifiableMap(config);
+ }
+
+ public @TempDir @NonNullByDefault({}) File folder;
+
+ @Test
+ public void testEmpty() throws Exception {
+ assertNull(DynamoDBConfig.fromConfig(new HashMap<>()));
+ }
+
+ @Test
+ public void testInvalidRegion() throws Exception {
+ assertNull(DynamoDBConfig.fromConfig(Collections.singletonMap("region", "foobie")));
+ }
+
+ @Test
+ public void testRegionOnly() throws Exception {
+ assertNull(DynamoDBConfig.fromConfig(Collections.singletonMap("region", "eu-west-1")));
+ }
+
+ @Test
+ public void testRegionWithAccessKeys() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig
+ .fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1", "secretKey", "secret1"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(1, fromConfig.getReadCapacityUnits());
+ assertEquals(1, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithProfilesConfigFile() throws Exception {
+ Path credsFile = Files.createFile(Paths.get(folder.getPath(), "creds"));
+ Files.write(
+ credsFile, ("[fooprofile]\n" + "aws_access_key_id=testAccessKey\n"
+ + "aws_secret_access_key=testSecretKey\n" + "aws_session_token=testSessionToken\n").getBytes(),
+ StandardOpenOption.TRUNCATE_EXISTING);
+
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "profilesConfigFile",
+ credsFile.toAbsolutePath().toString(), "profile", "fooprofile"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(1, fromConfig.getReadCapacityUnits());
+ assertEquals(1, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testEmptyConfiguration() throws Exception {
+ assertNull(DynamoDBConfig.fromConfig(mapFrom()));
+ }
+
+ @Test
+ public void testRegionWithInvalidProfilesConfigFile() throws Exception {
+ Path credsFile = Files.createFile(Paths.get(folder.getPath(), "creds"));
+ Files.write(credsFile,
+ ("[fooprofile]\n" + "aws_access_key_idINVALIDKEY=testAccessKey\n"
+ + "aws_secret_access_key=testSecretKey\n" + "aws_session_token=testSessionToken\n").getBytes(),
+ StandardOpenOption.TRUNCATE_EXISTING);
+
+ assertNull(DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "profilesConfigFile",
+ credsFile.toFile().getAbsolutePath(), "profile", "fooprofile")));
+ }
+
+ @Test
+ public void testRegionWithProfilesConfigFileMissingProfile() throws Exception {
+ Path credsFile = Files.createFile(Paths.get(folder.getPath(), "creds"));
+ Files.write(
+ credsFile, ("[fooprofile]\n" + "aws_access_key_id=testAccessKey\n"
+ + "aws_secret_access_key=testSecretKey\n" + "aws_session_token=testSessionToken\n").getBytes(),
+ StandardOpenOption.TRUNCATE_EXISTING);
+
+ assertNull(DynamoDBConfig.fromConfig(
+ mapFrom("region", "eu-west-1", "profilesConfigFile", credsFile.toAbsolutePath().toString())));
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefix() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
+ "secretKey", "secret1", "tablePrefix", "foobie-"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("foobie-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(1, fromConfig.getReadCapacityUnits());
+ assertEquals(1, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefixWithCreateTable() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(
+ mapFrom("region", "eu-west-1", "accessKey", "access1", "secretKey", "secret1", "createTable", "false"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(false, fromConfig.isCreateTable());
+ assertEquals(1, fromConfig.getReadCapacityUnits());
+ assertEquals(1, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefixWithReadCapacityUnits() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
+ "secretKey", "secret1", "readCapacityUnits", "5"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(5, fromConfig.getReadCapacityUnits());
+ assertEquals(1, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefixWithWriteCapacityUnits() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
+ "secretKey", "secret1", "writeCapacityUnits", "5"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(1, fromConfig.getReadCapacityUnits());
+ assertEquals(5, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefixWithReadWriteCapacityUnits() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(mapFrom("region", "eu-west-1", "accessKey", "access1",
+ "secretKey", "secret1", "readCapacityUnits", "3", "writeCapacityUnits", "5"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(3, fromConfig.getReadCapacityUnits());
+ assertEquals(5, fromConfig.getWriteCapacityUnits());
+ assertEquals(1000L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(1000, fromConfig.getBufferSize());
+ }
+
+ @Test
+ public void testRegionWithAccessKeysWithPrefixWithReadWriteCapacityUnitsWithBufferSettings() throws Exception {
+ DynamoDBConfig fromConfig = DynamoDBConfig.fromConfig(
+ mapFrom("region", "eu-west-1", "accessKey", "access1", "secretKey", "secret1", "readCapacityUnits", "3",
+ "writeCapacityUnits", "5", "bufferCommitIntervalMillis", "501", "bufferSize", "112"));
+ assertEquals(Regions.EU_WEST_1, fromConfig.getRegion());
+ assertEquals("access1", fromConfig.getCredentials().getAWSAccessKeyId());
+ assertEquals("secret1", fromConfig.getCredentials().getAWSSecretKey());
+ assertEquals("openhab-", fromConfig.getTablePrefix());
+ assertEquals(true, fromConfig.isCreateTable());
+ assertEquals(3, fromConfig.getReadCapacityUnits());
+ assertEquals(5, fromConfig.getWriteCapacityUnits());
+ assertEquals(501L, fromConfig.getBufferCommitIntervalMillis());
+ assertEquals(112, fromConfig.getBufferSize());
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.jupiter.api.Assertions.assertEquals;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.junit.jupiter.api.Test;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class DynamoDBTableNameResolverTest {
+
+ @Test
+ public void testWithDynamoDBBigDecimalItem() {
+ assertEquals("prefixbigdecimal",
+ new DynamoDBTableNameResolver("prefix").fromItem(new DynamoDBBigDecimalItem()));
+ }
+
+ @Test
+ public void testWithDynamoDBStringItem() {
+ assertEquals("prefixstring", new DynamoDBTableNameResolver("prefix").fromItem(new DynamoDBStringItem()));
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class LocationItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "location";
+ // values are encoded as lat,lon[,alt] - ordering is by string comparison
+ private static final PointType STATE1 = new PointType(
+ new DecimalType("60.012033100120453345435345345345346365434630300230230032020393149"), new DecimalType(30.),
+ new DecimalType(3.0));
+ private static final PointType STATE2 = new PointType(new DecimalType(61.0), new DecimalType(30.));
+ private static final PointType STATE_BETWEEN = new PointType(new DecimalType(60.5), new DecimalType(30.));
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ LocationItem item = (LocationItem) ITEMS.get(NAME);
+ item.setState(STATE1);
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.jupiter.api.Assertions.assertTrue;
+
+import java.math.BigDecimal;
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class NumberItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "number";
+ // On purpose we have a super accurate number here (testing the limits of AWS)
+ private static final DecimalType STATE1 = new DecimalType(new BigDecimal(
+ "-32343243.193490838904389298049802398048923849032809483209483209482309840239840932840932849083094809483"));
+ private static final DecimalType STATE2 = new DecimalType(600.9123);
+ private static final DecimalType STATE_BETWEEN = new DecimalType(500);
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ NumberItem item = (NumberItem) ITEMS.get(NAME);
+
+ item.setState(STATE1);
+
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+
+ /**
+ * Use relaxed state comparison due to numerical rounding. See also DynamoDBBigDecimalItem.loseDigits
+ */
+ @Override
+ protected void assertStateEquals(State expected, State actual) {
+ BigDecimal expectedDecimal = ((DecimalType) expected).toBigDecimal();
+ BigDecimal actualDecimal = ((DecimalType) actual).toBigDecimal();
+ assertTrue(DynamoDBBigDecimalItem.loseDigits(expectedDecimal).compareTo(actualDecimal) == 0);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.jupiter.api.Assertions.*;
+import static org.junit.jupiter.api.Assumptions.assumeTrue;
+
+import java.math.BigDecimal;
+import java.time.ZonedDateTime;
+import java.util.Arrays;
+import java.util.Iterator;
+import java.util.LinkedList;
+import java.util.List;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.junit.jupiter.api.Test;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Operator;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class PagingIntegrationTest extends BaseIntegrationTest {
+
+ private static final String NAME = "number";
+ private static final int STATE_COUNT = 10;
+
+ private static @Nullable ZonedDateTime storeStart;
+
+ @BeforeAll
+ public static void checkService() throws InterruptedException {
+ String msg = "DynamoDB integration tests will be skipped. Did you specify AWS credentials for testing? "
+ + "See BaseIntegrationTest for more details";
+ if (service == null) {
+ System.out.println(msg);
+ }
+ assumeTrue(service != null, msg);
+
+ populateData();
+ }
+
+ public static void populateData() {
+ storeStart = ZonedDateTime.now();
+
+ NumberItem item = (NumberItem) ITEMS.get(NAME);
+
+ for (int i = 0; i < STATE_COUNT; i++) {
+ item.setState(new DecimalType(i));
+ service.store(item);
+ }
+ }
+
+ @Test
+ public void testPagingFirstPage() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(NAME);
+ criteria.setBeginDate(storeStart);
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setPageNumber(0);
+ criteria.setPageSize(3);
+ assertItemStates(BaseIntegrationTest.service.query(criteria), 0, 1, 2);
+ }
+
+ @Test
+ public void testPagingSecondPage() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(NAME);
+ criteria.setBeginDate(storeStart);
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setPageNumber(1);
+ criteria.setPageSize(3);
+ assertItemStates(BaseIntegrationTest.service.query(criteria), 3, 4, 5);
+ }
+
+ @Test
+ public void testPagingPagePartialPage() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(NAME);
+ criteria.setBeginDate(storeStart);
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setPageNumber(3);
+ criteria.setPageSize(3);
+ assertItemStates(BaseIntegrationTest.service.query(criteria), 9);
+ }
+
+ @Test
+ public void testPagingPageOutOfBounds() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(NAME);
+ criteria.setBeginDate(storeStart);
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setPageNumber(4);
+ criteria.setPageSize(3);
+ assertItemStates(BaseIntegrationTest.service.query(criteria)); // no results
+ }
+
+ @Test
+ public void testPagingPage0Descending() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(NAME);
+ criteria.setBeginDate(storeStart);
+ criteria.setOrdering(Ordering.DESCENDING);
+ criteria.setPageNumber(0);
+ criteria.setPageSize(3);
+ assertItemStates(BaseIntegrationTest.service.query(criteria), 9, 8, 7);
+ }
+
+ @Test
+ public void testPagingPage0HugePageSize() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(NAME);
+ criteria.setBeginDate(storeStart);
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setPageNumber(0);
+ criteria.setPageSize(900);
+ assertItemStates(BaseIntegrationTest.service.query(criteria), 0, 1, 2, 3, 4, 5, 6, 7, 8, 9);
+ }
+
+ @Test
+ public void testPagingFirstPageWithFilter() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(NAME);
+ criteria.setBeginDate(storeStart);
+ criteria.setOrdering(Ordering.ASCENDING);
+ criteria.setPageNumber(0);
+ criteria.setPageSize(3);
+ criteria.setOperator(Operator.GT);
+ criteria.setState(new DecimalType(new BigDecimal(3)));
+ assertItemStates(BaseIntegrationTest.service.query(criteria), 4, 5, 6);
+ }
+
+ private void assertItemStates(Iterable<HistoricItem> actualIterable, int... expected) {
+ Iterator<HistoricItem> actualIterator = actualIterable.iterator();
+ List<HistoricItem> got = new LinkedList<>();
+ for (int expectedState : expected) {
+ assertTrue(actualIterator.hasNext());
+ HistoricItem actual = actualIterator.next();
+ assertEquals(new DecimalType(expectedState), actual.getState());
+ got.add(actual);
+ }
+ if (actualIterator.hasNext()) {
+ fail("Did not expect any more items, but got at least this extra element: "
+ + actualIterator.next().toString() + ". Before this we got: " + Arrays.toString(got.toArray()));
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.PlayerItem;
+import org.openhab.core.library.types.PlayPauseType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class PlayerItemPlayPauseIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "player_playpause";
+ private static final PlayPauseType STATE1 = PlayPauseType.PAUSE;
+ private static final PlayPauseType STATE2 = PlayPauseType.PLAY;
+ private static final @Nullable PlayPauseType STATE_BETWEEN = null;
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ PlayerItem item = (PlayerItem) ITEMS.get(NAME);
+
+ item.setState(STATE1);
+
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.PlayerItem;
+import org.openhab.core.library.types.RewindFastforwardType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class PlayerItemRewindFastForwardIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "player_rewindfastforward";
+ private static final RewindFastforwardType STATE1 = RewindFastforwardType.FASTFORWARD;
+ private static final RewindFastforwardType STATE2 = RewindFastforwardType.REWIND;
+ private static final @Nullable RewindFastforwardType STATE_BETWEEN = null;
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ PlayerItem item = (PlayerItem) ITEMS.get(NAME);
+
+ item.setState(STATE1);
+
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import static org.junit.jupiter.api.Assertions.assertTrue;
+
+import java.math.BigDecimal;
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class RollershutterItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "rollershutter";
+ private static final PercentType STATE1 = PercentType.ZERO;
+ private static final PercentType STATE2 = new PercentType("72.938289428989489389329834898929892439842399483498");
+ private static final PercentType STATE_BETWEEN = new PercentType(66); // value between STATE1 and STATE2; never actually stored
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ RollershutterItem item = (RollershutterItem) ITEMS.get(NAME);
+ item.setState(STATE1);
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+
+ /**
+ * Use relaxed state comparison due to numerical rounding. See also DynamoDBBigDecimalItem.loseDigits
+ */
+ @Override
+ protected void assertStateEquals(State expected, State actual) {
+ BigDecimal expectedDecimal = ((DecimalType) expected).toBigDecimal();
+ BigDecimal actualDecimal = ((DecimalType) actual).toBigDecimal();
+ assertTrue(DynamoDBBigDecimalItem.loseDigits(expectedDecimal).compareTo(actualDecimal) == 0);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class StringItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "string";
+ private static final StringType STATE1 = new StringType("b001");
+ private static final StringType STATE2 = new StringType("c002");
+ private static final StringType STATE_BETWEEN = new StringType("b001");
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ StringItem item = (StringItem) ITEMS.get(NAME);
+ item.setState(STATE1);
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.dynamodb.internal;
+
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.BeforeAll;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.types.State;
+
+/**
+ *
+ * @author Sami Salonen - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class SwitchItemIntegrationTest extends AbstractTwoItemIntegrationTest {
+
+ private static final String NAME = "switch";
+ private static final OnOffType STATE1 = OnOffType.OFF;
+ private static final OnOffType STATE2 = OnOffType.ON;
+ // There is no OnOffType state value between OFF and ON.
+ // Omit the extended query tests in AbstractTwoItemIntegrationTest by setting stateBetween to null.
+ private static final @Nullable OnOffType STATE_BETWEEN = null;
+
+ @BeforeAll
+ public static void storeData() throws InterruptedException {
+ SwitchItem item = (SwitchItem) ITEMS.get(NAME);
+ item.setState(STATE1);
+ beforeStore = ZonedDateTime.now();
+ Thread.sleep(10);
+ service.store(item);
+ afterStore1 = ZonedDateTime.now();
+ Thread.sleep(10);
+ item.setState(STATE2);
+ service.store(item);
+ Thread.sleep(10);
+ afterStore2 = ZonedDateTime.now();
+
+ LOGGER.info("Created item between {} and {}", AbstractDynamoDBItem.DATEFORMATTER.format(beforeStore),
+ AbstractDynamoDBItem.DATEFORMATTER.format(afterStore1));
+ }
+
+ @Override
+ protected String getItemName() {
+ return NAME;
+ }
+
+ @Override
+ protected State getFirstItemState() {
+ return STATE1;
+ }
+
+ @Override
+ protected State getSecondItemState() {
+ return STATE2;
+ }
+
+ @Override
+ protected @Nullable State getQueryItemStateBetween() {
+ return STATE_BETWEEN;
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ <attribute name="test" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.persistence.influxdb</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+# InfluxDB (0.9 and newer) Persistence
+
+This service allows you to persist and query states using the [InfluxDB](https://www.influxdata.com/products/influxdb-overview/) and [InfluxDB 2.0](https://v2.docs.influxdata.com/v2.0/) time series databases. The persisted values can be queried from within openHAB. There are also nice tools on the web for visualizing InfluxDB time series, such as [Grafana](http://grafana.org/).
+
+## Database Structure
+
+- The states of an item are persisted in *measurement* points with names equal to the name of the item, or the alias, if one is provided. In both variants, a *tag* named "item" is added, containing the item name.
+- All values are stored in a *field* called "value", using integers or doubles if possible; `OnOffType` and `OpenClosedType` values are stored as 0 or 1.
+- If configured, extra tags for the item category, label, or type can be added for each point.
+
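+With the default configuration (no alias, no extra tags), each persisted state of an item named "speedtest" corresponds to a single point; in InfluxDB line protocol such a point would look roughly like this (hypothetical values):
+
+    speedtest,item=speedtest value=123289369.0 1558302027124000000
+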
+Some example entries for an item with the name "speedtest" without any further configuration would look like this:
+
+(The query below uses InfluxDB 2.0 Flux syntax; the query syntax for 1.X differs.)
+
+    from(bucket: "default")
+        |> range(start: -30d)
+        |> filter(fn: (r) => r._measurement == "speedtest")
+
+    name: speedtest
+
+    _time               _item     _value
+    -----               -----     ------
+    1558302027124000000 speedtest 123289369.0
+    1558332852716000000 speedtest 80423789.0
+
+
+## Prerequisites
+
+First of all, you have to set up and run an InfluxDB 1.X or 2.X server.
+This is very easy, and you will find good documentation on the
+[InfluxDB web site for the 2.X version](https://v2.docs.influxdata.com/v2.0/get-started/) and the [InfluxDB web site for the 1.X version](https://docs.influxdata.com/influxdb/v1.7/).
+
+## Configuration
+
+| Property | Default | Required | Description |
+|------------------------------------|-------------------------|----------|------------------------------------------|
+| version | V1 | No | InfluxDB database version: V1 for 1.X and V2 for 2.X |
+| url | http://127.0.0.1:8086 | No | database URL |
+| user | openhab | No | name of the database user, e.g. `openhab`|
+| password | | No(*) | password of the database user you choose |
+| token | | No(*) | token to authenticate to the database (only for V2) [Instructions on how to create one](https://v2.docs.influxdata.com/v2.0/security/tokens/create-token/) |
+| db | openhab | No | name of the database for V1 and name of the organization for V2 |
+| retentionPolicy | autogen | No | name of the retention policy for V1 and name of the bucket for V2 |
+
+(*) For the 1.X version you must provide a user and password; for the 2.X version you can use either a user and password or a token. That means
+that if you keep all the default values, you must at minimum provide a password or a token.
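+
+As a sketch, a minimal `services/influxdb.cfg` for a 2.X server with token authentication could look like this (all values are placeholders, not shipped defaults):
+
+    version=V2
+    url=http://127.0.0.1:8086
+    token=my-token
+    db=my-organization
+    retentionPolicy=my-bucket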
+
+All item- and event-related configuration is defined in the file `persistence/influxdb.persist`.
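+
+As an example, a minimal `persistence/influxdb.persist` that stores every item on each change and restores the last values at startup could look like this (a sketch using the standard openHAB persistence syntax; adjust the strategies to your needs):
+
+    Strategies {
+        default = everyChange
+    }
+
+    Items {
+        * : strategy = everyChange, restoreOnStartup
+    }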
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://maven.apache.org/POM/4.0.0"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.persistence.influxdb</artifactId>
+
+ <name>openHAB Add-ons :: Bundles :: Persistence Service :: InfluxDB</name>
+
+ <properties>
+ <bnd.importpackage>
+ !javax.annotation;!android.*,!com.android.*,!com.google.appengine.*,!dalvik.system,!kotlin.*,!kotlinx.*,!org.conscrypt,!sun.security.ssl,!org.apache.harmony.*,!org.apache.http.*,!rx.*,!org.msgpack.*
+ </bnd.importpackage>
+ </properties>
+
+ <dependencies>
+ <!-- START InfluxDB 2.0 -->
+ <!-- START influxdb-client-java -->
+ <dependency>
+ <groupId>com.influxdb</groupId>
+ <artifactId>influxdb-client-java</artifactId>
+ <version>1.6.0</version>
+ </dependency>
+ <dependency>
+ <artifactId>influxdb-client-core</artifactId>
+ <groupId>com.influxdb</groupId>
+ <version>1.6.0</version>
+ </dependency>
+ <dependency>
+ <artifactId>converter-gson</artifactId>
+ <groupId>com.squareup.retrofit2</groupId>
+ <version>2.5.0</version>
+ </dependency>
+ <dependency>
+ <artifactId>converter-scalars</artifactId>
+ <groupId>com.squareup.retrofit2</groupId>
+ <version>2.5.0</version>
+ </dependency>
+ <dependency>
+ <artifactId>gson</artifactId>
+ <groupId>com.google.code.gson</groupId>
+ <version>2.8.5</version>
+ </dependency>
+ <dependency>
+ <artifactId>gson-fire</artifactId>
+ <groupId>io.gsonfire</groupId>
+ <version>1.8.0</version>
+ </dependency>
+ <dependency>
+ <artifactId>okio</artifactId>
+ <groupId>com.squareup.okio</groupId>
+ <version>1.17.3</version>
+ </dependency>
+ <dependency>
+ <artifactId>commons-csv</artifactId>
+ <groupId>org.apache.commons</groupId>
+ <version>1.6</version>
+ </dependency>
+ <dependency>
+ <artifactId>json</artifactId>
+ <groupId>org.json</groupId>
+ <version>20180813</version>
+ </dependency>
+ <dependency>
+ <artifactId>okhttp</artifactId>
+ <groupId>com.squareup.okhttp3</groupId>
+ <version>3.14.4</version>
+ </dependency>
+ <dependency>
+ <artifactId>retrofit</artifactId>
+ <groupId>com.squareup.retrofit2</groupId>
+ <version>2.6.2</version>
+ </dependency>
+ <dependency>
+ <artifactId>jsr305</artifactId>
+ <groupId>com.google.code.findbugs</groupId>
+ <version>3.0.2</version>
+ </dependency>
+ <dependency>
+ <artifactId>logging-interceptor</artifactId>
+ <groupId>com.squareup.okhttp3</groupId>
+ <version>3.14.4</version>
+ </dependency>
+ <dependency>
+ <artifactId>rxjava</artifactId>
+ <groupId>io.reactivex.rxjava2</groupId>
+ <version>2.2.17</version>
+ </dependency>
+ <dependency>
+ <artifactId>reactive-streams</artifactId>
+ <groupId>org.reactivestreams</groupId>
+ <version>1.0.3</version>
+ </dependency>
+ <dependency>
+ <artifactId>swagger-annotations</artifactId>
+ <groupId>io.swagger</groupId>
+ <version>1.5.22</version>
+ </dependency>
+ <!--END influxdb-client-java -->
+
+ <dependency>
+ <groupId>com.influxdb</groupId>
+ <artifactId>flux-dsl</artifactId>
+ <version>1.6.0</version>
+ </dependency>
+
+ <!--END InfluxDB 2.0 -->
+
+ <!--START InfluxDB 1.0 -->
+ <dependency>
+ <groupId>org.influxdb</groupId>
+ <artifactId>influxdb-java</artifactId>
+ <version>2.17</version>
+ </dependency>
+ <dependency>
+ <groupId>com.squareup.retrofit2</groupId>
+ <artifactId>converter-moshi</artifactId>
+ <version>2.6.2</version>
+ </dependency>
+ <dependency>
+ <groupId>com.squareup.moshi</groupId>
+ <artifactId>moshi</artifactId>
+ <version>1.8.0</version>
+ </dependency>
+ <!-- Okhttp & Retrofit from 2.0 are ok -->
+
+ <!-- END InfluxDB 1.0 -->
+ </dependencies>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.persistence.influxdb-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+ <repository>
+ mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features
+ </repository>
+
+ <feature name="openhab-persistence-influxdb" description="InfluxDB Persistence" version="${project.version}">
+ <feature>openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.influxdb/${project.version}
+ </bundle>
+ <configfile finalname="${openhab.conf}/services/influxdb.cfg" override="false">
+ mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/influxdb
+ </configfile>
+ </feature>
+
+</features>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb;
+
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.util.Collections;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.config.core.ConfigurableService;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.items.MetadataRegistry;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.State;
+import org.openhab.persistence.influxdb.internal.FilterCriteriaQueryCreator;
+import org.openhab.persistence.influxdb.internal.InfluxDBConfiguration;
+import org.openhab.persistence.influxdb.internal.InfluxDBHistoricItem;
+import org.openhab.persistence.influxdb.internal.InfluxDBPersistentItemInfo;
+import org.openhab.persistence.influxdb.internal.InfluxDBRepository;
+import org.openhab.persistence.influxdb.internal.InfluxDBStateConvertUtils;
+import org.openhab.persistence.influxdb.internal.InfluxPoint;
+import org.openhab.persistence.influxdb.internal.InfluxRow;
+import org.openhab.persistence.influxdb.internal.ItemToStorePointCreator;
+import org.openhab.persistence.influxdb.internal.RepositoryFactory;
+import org.osgi.framework.Constants;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Modified;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * This is the implementation of the InfluxDB {@link PersistenceService}. It persists item values
+ * using the <a href="http://influxdb.org">InfluxDB</a> time series database. The states
+ * ({@link State}) of an {@link Item} are persisted by default in a time series whose name equals the name of
+ * the item.
+ *
+ * This add-on supports InfluxDB 1.X and 2.X. As the two versions are incompatible and use different drivers, the
+ * version-specific code is accessed through the {@link InfluxDBRepository} and {@link FilterCriteriaQueryCreator}
+ * interfaces, and the specific implementations reside in the {@link org.openhab.persistence.influxdb.internal.influx1}
+ * and {@link org.openhab.persistence.influxdb.internal.influx2} packages.
+ *
+ * @author Theo Weiss - Initial contribution, rewrite of org.openhab.persistence.influxdb
+ * @author Joan Pujol Espinar - Add-on rewrite, refactoring the code and adding support for InfluxDB 2.0. Some tag
+ * code is based on a non-integrated branch from Dominik Vorreiter
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.influxdb", //
+ property = Constants.SERVICE_PID + "=org.openhab.influxdb")
+@ConfigurableService(category = "persistence", label = "InfluxDB Persistence Service", description_uri = InfluxDBPersistenceService.CONFIG_URI)
+public class InfluxDBPersistenceService implements QueryablePersistenceService {
+ public static final String SERVICE_NAME = "influxdb";
+
+ private final Logger logger = LoggerFactory.getLogger(InfluxDBPersistenceService.class);
+
+ protected static final String CONFIG_URI = "persistence:influxdb";
+
+ // External dependencies
+ private final ItemRegistry itemRegistry;
+ private final MetadataRegistry metadataRegistry;
+
+ // Internal dependencies/state
+ private InfluxDBConfiguration configuration = InfluxDBConfiguration.NO_CONFIGURATION;
+
+ // Relax null rules because these fields can only be null while the component is not active
+ private @NonNullByDefault({}) ItemToStorePointCreator itemToStorePointCreator;
+ private @NonNullByDefault({}) InfluxDBRepository influxDBRepository;
+
+ @Activate
+ public InfluxDBPersistenceService(final @Reference ItemRegistry itemRegistry,
+ final @Reference MetadataRegistry metadataRegistry) {
+ this.itemRegistry = itemRegistry;
+ this.metadataRegistry = metadataRegistry;
+ }
+
+ /**
+ * Connect to database when service is activated
+ */
+ @Activate
+ public void activate(final @Nullable Map<String, @Nullable Object> config) {
+ logger.debug("InfluxDB persistence service is being activated");
+
+ if (loadConfiguration(config)) {
+ itemToStorePointCreator = new ItemToStorePointCreator(configuration, metadataRegistry);
+ influxDBRepository = createInfluxDBRepository();
+ influxDBRepository.connect();
+ } else {
+ logger.error("Cannot load configuration, persistence service wont work");
+ }
+
+ logger.debug("InfluxDB persistence service is now activated");
+ }
+
+ // Visible for testing
+ protected InfluxDBRepository createInfluxDBRepository() {
+ return RepositoryFactory.createRepository(configuration);
+ }
+
+ /**
+ * Disconnect from database when service is deactivated
+ */
+ @Deactivate
+ public void deactivate() {
+ logger.debug("InfluxDB persistence service deactivated");
+ if (influxDBRepository != null) {
+ influxDBRepository.disconnect();
+ influxDBRepository = null;
+ }
+ itemToStorePointCreator = null;
+ }
+
+ /**
+ * Rerun deactivation/activation code each time configuration is changed
+ */
+ @Modified
+ protected void modified(@Nullable Map<String, @Nullable Object> config) {
+ if (config != null) {
+ logger.debug("Config has been modified will deactivate/activate with new config");
+
+ deactivate();
+ activate(config);
+ } else {
+ logger.warn("Null configuration, ignoring");
+ }
+ }
+
+ private boolean loadConfiguration(@Nullable Map<String, @Nullable Object> config) {
+ boolean configurationIsValid;
+ if (config != null) {
+ configuration = new InfluxDBConfiguration(config);
+ configurationIsValid = configuration.isValid();
+ if (configurationIsValid) {
+ logger.debug("Loaded configuration {}", config);
+ } else {
+ logger.warn("Some configuration properties are not valid {}", config);
+ }
+ } else {
+ configuration = InfluxDBConfiguration.NO_CONFIGURATION;
+ configurationIsValid = false;
+ logger.warn("Ignoring configuration because it's null");
+ }
+ return configurationIsValid;
+ }
+
+ @Override
+ public String getId() {
+ return SERVICE_NAME;
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "InfluxDB persistence layer";
+ }
+
+ @Override
+ public Set<PersistenceItemInfo> getItemInfo() {
+ if (influxDBRepository != null && influxDBRepository.isConnected()) {
+ return influxDBRepository.getStoredItemsCount().entrySet().stream()
+ .map(entry -> new InfluxDBPersistentItemInfo(entry.getKey(), entry.getValue()))
+ .collect(Collectors.toUnmodifiableSet());
+ } else {
+ logger.info("getItemInfo ignored, InfluxDB is not yet connected");
+ return Collections.emptySet();
+ }
+ }
+
+ @Override
+ public void store(Item item) {
+ store(item, item.getName());
+ }
+
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ if (influxDBRepository != null && influxDBRepository.isConnected()) {
+ InfluxPoint point = itemToStorePointCreator.convert(item, alias);
+ if (point != null) {
+ logger.trace("Storing item {} in InfluxDB point {}", item, point);
+ influxDBRepository.write(point);
+ } else {
+ logger.trace("Ignoring item {} as is cannot be converted to a InfluxDB point", item);
+ }
+ } else {
+ logger.debug("store ignored, InfluxDB is not yet connected");
+ }
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ logger.debug("Got a query for historic points!");
+
+ if (influxDBRepository != null && influxDBRepository.isConnected()) {
+ logger.trace(
+ "Filter: itemname: {}, ordering: {}, state: {}, operator: {}, getBeginDate: {}, getEndDate: {}, getPageSize: {}, getPageNumber: {}",
+ filter.getItemName(), filter.getOrdering().toString(), filter.getState(), filter.getOperator(),
+ filter.getBeginDate(), filter.getEndDate(), filter.getPageSize(), filter.getPageNumber());
+
+ String query = RepositoryFactory.createQueryCreator(configuration).createQuery(filter,
+ configuration.getRetentionPolicy());
+ logger.trace("Query {}", query);
+ List<InfluxRow> results = influxDBRepository.query(query);
+ return results.stream().map(this::mapRow2HistoricItem).collect(Collectors.toList());
+ } else {
+ logger.debug("query ignored, InfluxDB is not yet connected");
+ return Collections.emptyList();
+ }
+ }
+
+ private HistoricItem mapRow2HistoricItem(InfluxRow row) {
+ State state = InfluxDBStateConvertUtils.objectToState(row.getValue(), row.getItemName(), itemRegistry);
+ return new InfluxDBHistoricItem(row.getItemName(), state,
+ ZonedDateTime.ofInstant(row.getTime(), ZoneId.systemDefault()));
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return List.of(PersistenceStrategy.Globals.RESTORE, PersistenceStrategy.Globals.CHANGE);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.persistence.FilterCriteria;
+
+/**
+ * Creates an InfluxDB query sentence given an openHAB persistence {@link FilterCriteria}
+ *
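+ * For instance, a criteria restricted to item "Temperature" makes the InfluxDB 1.0 implementation produce an
+ * InfluxQL sentence along the lines of {@code SELECT value FROM autogen."Temperature"} (illustrative; the exact
+ * escaping is delegated to the client's query builder).
+ *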
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public interface FilterCriteriaQueryCreator {
+ /**
+ * Create query from {@link FilterCriteria}
+ *
+ * @param criteria Criteria to create query from
+ * @param retentionPolicy Name of the retentionPolicy/bucket to use in query
+ * @return Created query as a String
+ */
+ String createQuery(FilterCriteria criteria, String retentionPolicy);
+
+ default String getOperationSymbol(FilterCriteria.Operator operator, InfluxDBVersion version) {
+ switch (operator) {
+ case EQ:
+ return "=";
+ case LT:
+ return "<";
+ case LTE:
+ return "<=";
+ case GT:
+ return ">";
+ case GTE:
+ return ">=";
+ case NEQ:
+ return version == InfluxDBVersion.V1 ? "<>" : "!=";
+ default:
+ throw new UnnexpectedConditionException("Not expected operator " + operator);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import java.util.Collections;
+import java.util.Map;
+import java.util.StringJoiner;
+
+import org.eclipse.jdt.annotation.NonNull;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Contains this add-on's configurable parameters
+ *
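+ * An illustrative configuration map, using the parameter constants below:
+ * {@code Map.of("version", "V2", "url", "http://127.0.0.1:8086", "token", "my-token")} configures a 2.X server
+ * and keeps the default database/organization ("openhab") and retention policy/bucket ("autogen").
+ *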
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class InfluxDBConfiguration {
+ public static final String URL_PARAM = "url";
+ public static final String TOKEN_PARAM = "token";
+ public static final String USER_PARAM = "user";
+ public static final String PASSWORD_PARAM = "password";
+ public static final String DATABASE_PARAM = "db";
+ public static final String RETENTION_POLICY_PARAM = "retentionPolicy";
+ public static final String VERSION_PARAM = "version";
+ public static final String REPLACE_UNDERSCORE_PARAM = "replaceUnderscore";
+ public static final String ADD_CATEGORY_TAG_PARAM = "addCategoryTag";
+ public static final String ADD_LABEL_TAG_PARAM = "addLabelTag";
+ public static final String ADD_TYPE_TAG_PARAM = "addTypeTag";
+ public static final InfluxDBConfiguration NO_CONFIGURATION = new InfluxDBConfiguration(Collections.emptyMap());
+ private final Logger logger = LoggerFactory.getLogger(InfluxDBConfiguration.class);
+ private final String url;
+ private final String user;
+ private final String password;
+ private final String token;
+ private final String databaseName;
+ private final String retentionPolicy;
+ private final InfluxDBVersion version;
+
+ private final boolean replaceUnderscore;
+ private final boolean addCategoryTag;
+ private final boolean addTypeTag;
+ private final boolean addLabelTag;
+
+ public InfluxDBConfiguration(Map<String, @Nullable Object> config) {
+ url = (@NonNull String) config.getOrDefault(URL_PARAM, "http://127.0.0.1:8086");
+ user = (@NonNull String) config.getOrDefault(USER_PARAM, "openhab");
+ password = (@NonNull String) config.getOrDefault(PASSWORD_PARAM, "");
+ token = (@NonNull String) config.getOrDefault(TOKEN_PARAM, "");
+ databaseName = (@NonNull String) config.getOrDefault(DATABASE_PARAM, "openhab");
+ retentionPolicy = (@NonNull String) config.getOrDefault(RETENTION_POLICY_PARAM, "autogen");
+ version = parseInfluxVersion(config.getOrDefault(VERSION_PARAM, InfluxDBVersion.V1.name()));
+
+ replaceUnderscore = getConfigBooleanValue(config, REPLACE_UNDERSCORE_PARAM, false);
+ addCategoryTag = getConfigBooleanValue(config, ADD_CATEGORY_TAG_PARAM, false);
+ addLabelTag = getConfigBooleanValue(config, ADD_LABEL_TAG_PARAM, false);
+ addTypeTag = getConfigBooleanValue(config, ADD_TYPE_TAG_PARAM, false);
+ }
+
+ private static boolean getConfigBooleanValue(Map<String, @Nullable Object> config, String key,
+ boolean defaultValue) {
+ Object object = config.get(key);
+ if (object instanceof Boolean) {
+ return (Boolean) object;
+ } else if (object instanceof String) {
+ return "true".equalsIgnoreCase((String) object);
+ } else {
+ return defaultValue;
+ }
+ }
+
+ private InfluxDBVersion parseInfluxVersion(@Nullable Object value) {
+ try {
+ return InfluxDBVersion.valueOf((String) value);
+ } catch (RuntimeException e) {
+ logger.warn("Invalid version {}", value);
+ return InfluxDBVersion.UNKNOWN;
+ }
+ }
+
+ public boolean isValid() {
+ boolean hasVersion = version != InfluxDBVersion.UNKNOWN;
+ boolean hasCredentials = false;
+ if (version == InfluxDBVersion.V1) {
+ hasCredentials = !user.isBlank() && !password.isBlank();
+ } else if (version == InfluxDBVersion.V2) {
+ hasCredentials = !token.isBlank() || (!user.isBlank() && !password.isBlank());
+ }
+ boolean hasDatabase = !databaseName.isBlank();
+ boolean hasRetentionPolicy = !retentionPolicy.isBlank();
+
+ boolean valid = hasVersion && hasCredentials && hasDatabase && hasRetentionPolicy;
+ if (valid) {
+ return true;
+ } else {
+ String msg = "InfluxDB configuration isn't valid. The add-on won't work: ";
+ StringJoiner reason = new StringJoiner(",");
+ if (!hasVersion) {
+ reason.add("Unknown version");
+ } else {
+ if (!hasCredentials) {
+ reason.add("No credentials");
+ }
+ if (!hasDatabase) {
+ reason.add("No database name / organization defined");
+ }
+ if (!hasRetentionPolicy) {
+ reason.add("No retention policy / bucket defined");
+ }
+ }
+ logger.warn("{} {}", msg, reason);
+ return false;
+ }
+ }
+
+ public String getUrl() {
+ return url;
+ }
+
+ public String getToken() {
+ return token;
+ }
+
+ public String getDatabaseName() {
+ return databaseName;
+ }
+
+ public String getRetentionPolicy() {
+ return retentionPolicy;
+ }
+
+ public boolean isReplaceUnderscore() {
+ return replaceUnderscore;
+ }
+
+ public boolean isAddCategoryTag() {
+ return addCategoryTag;
+ }
+
+ public boolean isAddTypeTag() {
+ return addTypeTag;
+ }
+
+ public boolean isAddLabelTag() {
+ return addLabelTag;
+ }
+
+ public String getUser() {
+ return user;
+ }
+
+ public String getPassword() {
+ return password;
+ }
+
+ public InfluxDBVersion getVersion() {
+ return version;
+ }
+
+ @Override
+ public String toString() {
+ String sb = "InfluxDBConfiguration{" + "url='" + url + '\'' + ", user='" + user + '\'' + ", password='"
+ + password.length() + " chars" + '\'' + ", token='" + token.length() + " chars" + '\''
+ + ", databaseName='" + databaseName + '\'' + ", retentionPolicy='" + retentionPolicy + '\''
+ + ", version=" + version + ", replaceUnderscore=" + replaceUnderscore + ", addCategoryTag="
+ + addCategoryTag + ", addTypeTag=" + addTypeTag + ", addLabelTag=" + addLabelTag + '}';
+ return sb;
+ }
+
+ public int getTokenLength() {
+ return token.length();
+ }
+
+ public char[] getTokenAsCharArray() {
+ return token.toCharArray();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * Constants used by this addon
+ *
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class InfluxDBConstants {
+ public static final String COLUMN_VALUE_NAME_V1 = "value";
+ public static final String COLUMN_VALUE_NAME_V2 = "_value";
+
+ public static final String COLUMN_TIME_NAME_V1 = "time";
+ public static final String COLUMN_TIME_NAME_V2 = "_time";
+
+ public static final String FIELD_VALUE_NAME = "value";
+ public static final String TAG_ITEM_NAME = "item";
+ public static final String TAG_CATEGORY_NAME = "category";
+ public static final String TAG_TYPE_NAME = "type";
+ public static final String TAG_LABEL_NAME = "label";
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import java.text.DateFormat;
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+
+/**
+ * Java bean used to return item query results from InfluxDB.
+ *
+ * @author Theo Weiss - Initial Contribution
+ * @author Joan Pujol Espinar - Addon rewrite refactoring code and adding support for InfluxDB 2.0.
+ */
+@NonNullByDefault
+public class InfluxDBHistoricItem implements HistoricItem {
+
+ private String name = "";
+ private State state = UnDefType.NULL;
+ private ZonedDateTime timestamp;
+
+ public InfluxDBHistoricItem(String name, State state, ZonedDateTime timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ public void setState(State state) {
+ this.state = state;
+ }
+
+ @Override
+ public ZonedDateTime getTimestamp() {
+ return timestamp;
+ }
+
+ public void setTimestamp(ZonedDateTime timestamp) {
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String toString() {
+ return DateFormat.getDateTimeInstance().format(timestamp) + ": " + name + " -> " + state.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import java.util.Date;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.persistence.PersistenceItemInfo;
+
+/**
+ * Java bean used to return information about stored items
+ *
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class InfluxDBPersistentItemInfo implements PersistenceItemInfo {
+ private final String name;
+ private final Integer count;
+
+ public InfluxDBPersistentItemInfo(String name, Integer count) {
+ this.name = name;
+ this.count = count;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ @Nullable
+ public Integer getCount() {
+ return count;
+ }
+
+ @Override
+ @Nullable
+ public Date getEarliest() {
+ return null;
+ }
+
+ @Override
+ @Nullable
+ public Date getLatest() {
+ return null;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import java.util.List;
+import java.util.Map;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * Manages InfluxDB server interaction maintaining client connection
+ *
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public interface InfluxDBRepository {
+ /**
+ * Returns whether the client is successfully connected to the server
+ *
+ * @return True if it's connected, otherwise false
+ */
+ boolean isConnected();
+
+ /**
+ * Connect to InfluxDB server
+ *
+ * @return True if successful, otherwise false
+ */
+ boolean connect();
+
+ /**
+ * Disconnect from InfluxDB server
+ */
+ void disconnect();
+
+ /**
+ * Check if connection is currently ready
+ *
+ * @return True if it's ready, otherwise false
+ */
+ boolean checkConnectionStatus();
+
+ /**
+ * Returns all stored item names with their count of stored points
+ *
+ * @return Map with (ItemName, ItemCount) entries
+ */
+ Map<String, Integer> getStoredItemsCount();
+
+ /**
+ * Executes Flux query
+ *
+ * @param query Query
+ * @return Query results
+ */
+ List<InfluxRow> query(String query);
+
+ /**
+ * Write point to database
+ *
+ * @param influxPoint Point to write
+ */
+ void write(InfluxPoint influxPoint);
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import java.math.BigDecimal;
+import java.time.Instant;
+import java.time.ZonedDateTime;
+import java.util.TimeZone;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.GroupItem;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.library.types.QuantityType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.types.State;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Conversion logic between openHAB {@link State} types and InfluxDB store types
+ *
+ * @author Joan Pujol Espinar - Initial contribution, based on previous work from Theo Weiss and Dominik Vorreiter
+ */
+@NonNullByDefault
+public class InfluxDBStateConvertUtils {
+ static final Number DIGITAL_VALUE_OFF = 0; // Visible for testing
+ static final Number DIGITAL_VALUE_ON = 1; // Visible for testing
+ private static final Logger logger = LoggerFactory.getLogger(InfluxDBStateConvertUtils.class);
+
+ /**
+ * Converts a {@link State} into an object suitable as an InfluxDB value.
+ *
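+ * For example, {@code OnOffType.ON} is stored as the number 1 and {@code new DecimalType("21.5")} as the
+ * double 21.5.
+ *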
+ * @param state to be converted
+ * @return a number for DecimalType and QuantityType, 0 or 1 for OnOffType and OpenClosedType,
+ * epoch milliseconds for DateTimeType, and a String for all others
+ */
+ public static Object stateToObject(State state) {
+ Object value;
+ if (state instanceof HSBType) {
+ value = state.toString();
+ } else if (state instanceof PointType) {
+ value = point2String((PointType) state);
+ } else if (state instanceof DecimalType) {
+ value = convertBigDecimalToNum(((DecimalType) state).toBigDecimal());
+ } else if (state instanceof QuantityType<?>) {
+ value = convertBigDecimalToNum(((QuantityType<?>) state).toBigDecimal());
+ } else if (state instanceof OnOffType) {
+ value = state == OnOffType.ON ? DIGITAL_VALUE_ON : DIGITAL_VALUE_OFF;
+ } else if (state instanceof OpenClosedType) {
+ value = state == OpenClosedType.OPEN ? DIGITAL_VALUE_ON : DIGITAL_VALUE_OFF;
+ } else if (state instanceof DateTimeType) {
+ value = ((DateTimeType) state).getZonedDateTime().toInstant().toEpochMilli();
+ } else {
+ value = state.toString();
+ }
+ return value;
+ }
+
+ /**
+ * Converts a value to a {@link State} which is suitable for the given {@link Item}. This is
+ * needed when constructing an {@link InfluxDBHistoricItem} from a query result.
+ *
+ * @param value to be converted to a {@link State}
+ * @param itemName name of the {@link Item} to get the {@link State} for
+ * @return the state of the item represented by the itemName parameter, else the string value of
+ * the Object parameter
+ */
+ public static State objectToState(Object value, String itemName, @Nullable ItemRegistry itemRegistry) {
+ State state = null;
+ if (itemRegistry != null) {
+ try {
+ Item item = itemRegistry.getItem(itemName);
+ state = objectToState(value, item);
+ } catch (ItemNotFoundException e) {
+ logger.info("Could not find item '{}' in registry", itemName);
+ }
+ }
+
+ if (state == null) {
+ state = new StringType(String.valueOf(value));
+ }
+
+ return state;
+ }
+
+ public static State objectToState(Object value, Item itemToSetState) {
+ String valueStr = String.valueOf(value);
+
+ Item item = itemToSetState;
+ if (item instanceof GroupItem) {
+ item = ((GroupItem) item).getBaseItem();
+ }
+ if (item instanceof ColorItem) {
+ return new HSBType(valueStr);
+ } else if (item instanceof LocationItem) {
+ return new PointType(valueStr);
+ } else if (item instanceof NumberItem) {
+ return new DecimalType(valueStr);
+ } else if (item instanceof DimmerItem) {
+ return new PercentType(valueStr);
+ } else if (item instanceof SwitchItem) {
+ return toBoolean(valueStr) ? OnOffType.ON : OnOffType.OFF;
+ } else if (item instanceof ContactItem) {
+ return toBoolean(valueStr) ? OpenClosedType.OPEN : OpenClosedType.CLOSED;
+ } else if (item instanceof RollershutterItem) {
+ return new PercentType(valueStr);
+ } else if (item instanceof DateTimeItem) {
+ Instant i = Instant.ofEpochMilli(new BigDecimal(valueStr).longValue());
+ ZonedDateTime z = ZonedDateTime.ofInstant(i, TimeZone.getDefault().toZoneId());
+ return new DateTimeType(z);
+ } else {
+ return new StringType(valueStr);
+ }
+ }
+
+ private static boolean toBoolean(@Nullable Object object) {
+ if (object instanceof Boolean) {
+ return (Boolean) object;
+ } else if (object != null) {
+ if ("1".equals(object)) {
+ return true;
+ } else {
+ return Boolean.valueOf(String.valueOf(object));
+ }
+ } else {
+ return false;
+ }
+ }
+
+ private static String point2String(PointType point) {
+ StringBuilder buf = new StringBuilder();
+ buf.append(point.getLatitude().toString());
+ buf.append(",");
+ buf.append(point.getLongitude().toString());
+ if (!point.getAltitude().equals(DecimalType.ZERO)) {
+ buf.append(",");
+ buf.append(point.getAltitude().toString());
+ }
+ return buf.toString(); // latitude, longitude, altitude
+ }
+
+ /**
+ * This method returns a BigInteger if the value has no fractional part, otherwise a double. This is an
+ * optimization for InfluxDB because integers have less overhead.
+ *
+ * @param value the BigDecimal to be converted
+ * @return A BigInteger if possible, otherwise a double
+ */
+ private static Object convertBigDecimalToNum(BigDecimal value) {
+ Object convertedValue;
+ if (value.scale() == 0) {
+ convertedValue = value.toBigInteger();
+ } else {
+ convertedValue = value.doubleValue();
+ }
+ return convertedValue;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+/**
+ * InfluxDB version
+ *
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+public enum InfluxDBVersion {
+ V1,
+ V2,
+ UNKNOWN
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import java.time.Instant;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.eclipse.jdt.annotation.DefaultLocation;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * Point data to be stored in InfluxDB
+ *
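+ * An illustrative build using the fluent {@link Builder}:
+ * {@code InfluxPoint.newBuilder("Temperature").withTime(Instant.now()).withValue(21.5).withTag("item", "Temperature").build()}
+ *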
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault({ DefaultLocation.PARAMETER })
+public class InfluxPoint {
+ private String measurementName;
+ private Instant time;
+ private Object value;
+ private Map<String, String> tags;
+
+ private InfluxPoint(Builder builder) {
+ measurementName = builder.measurementName;
+ time = builder.time;
+ value = builder.value;
+ tags = builder.tags;
+ }
+
+ public static Builder newBuilder(String measurementName) {
+ return new Builder(measurementName);
+ }
+
+ public String getMeasurementName() {
+ return measurementName;
+ }
+
+ public Instant getTime() {
+ return time;
+ }
+
+ public Object getValue() {
+ return value;
+ }
+
+ public Map<String, String> getTags() {
+ return Collections.unmodifiableMap(tags);
+ }
+
+ public static final class Builder {
+ private String measurementName;
+ private Instant time;
+ private Object value;
+ private Map<String, String> tags = new HashMap<>();
+
+ private Builder(String measurementName) {
+ this.measurementName = measurementName;
+ }
+
+ public Builder withTime(Instant val) {
+ time = val;
+ return this;
+ }
+
+ public Builder withValue(Object val) {
+ value = val;
+ return this;
+ }
+
+ public Builder withTag(String name, String value) {
+ tags.put(name, value);
+ return this;
+ }
+
+ public InfluxPoint build() {
+ return new InfluxPoint(this);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import java.time.Instant;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * Row data returned from database query
+ *
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class InfluxRow {
+ private final String itemName;
+ private final Instant time;
+ private final Object value;
+
+ public InfluxRow(Instant time, String itemName, Object value) {
+ this.time = time;
+ this.itemName = itemName;
+ this.value = value;
+ }
+
+ public Instant getTime() {
+ return time;
+ }
+
+ public String getItemName() {
+ return itemName;
+ }
+
+ public Object getValue() {
+ return value;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import static org.openhab.persistence.influxdb.internal.InfluxDBConstants.*;
+
+import java.time.Instant;
+import java.util.Optional;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.Metadata;
+import org.openhab.core.items.MetadataKey;
+import org.openhab.core.items.MetadataRegistry;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.openhab.persistence.influxdb.InfluxDBPersistenceService;
+
+/**
+ * Logic to create an InfluxDB {@link InfluxPoint} from an openHAB {@link Item}
+ *
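+ * For an illustrative SwitchItem named "Light" in state ON, {@link #convert(Item, String)} yields a point in
+ * measurement "Light" with value 1 and an "item" tag of "Light", plus any category/label/type tags enabled in
+ * the configuration.
+ *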
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class ItemToStorePointCreator {
+ private final InfluxDBConfiguration configuration;
+ private final @Nullable MetadataRegistry metadataRegistry;
+
+ public ItemToStorePointCreator(InfluxDBConfiguration configuration, @Nullable MetadataRegistry metadataRegistry) {
+ this.configuration = configuration;
+ this.metadataRegistry = metadataRegistry;
+ }
+
+ public @Nullable InfluxPoint convert(Item item, @Nullable String storeAlias) {
+ if (item.getState() instanceof UnDefType) {
+ return null;
+ }
+
+ String measurementName = calculateMeasurementName(item, storeAlias);
+ String itemName = item.getName();
+ State state = getItemState(item);
+
+ Object value = InfluxDBStateConvertUtils.stateToObject(state);
+
+ InfluxPoint.Builder point = InfluxPoint.newBuilder(measurementName).withTime(Instant.now()).withValue(value)
+ .withTag(TAG_ITEM_NAME, itemName);
+
+ addPointTags(item, point);
+
+ return point.build();
+ }
+
+ private String calculateMeasurementName(Item item, @Nullable String storeAlias) {
+ String name = storeAlias != null && !storeAlias.isBlank() ? storeAlias : item.getName();
+
+ if (configuration.isReplaceUnderscore()) {
+ name = name.replace('_', '.');
+ }
+
+ return name;
+ }
+
+ private State getItemState(Item item) {
+ final State state;
+ final Optional<Class<? extends State>> desiredConversion = calculateDesiredTypeConversionToStore(item);
+ if (desiredConversion.isPresent()) {
+ State convertedState = item.getStateAs(desiredConversion.get());
+ if (convertedState != null) {
+ state = convertedState;
+ } else {
+ state = item.getState();
+ }
+ } else {
+ state = item.getState();
+ }
+ return state;
+ }
+
+ private Optional<Class<? extends State>> calculateDesiredTypeConversionToStore(Item item) {
+ // Pick the first accepted command type that is also a State, so asSubclass below cannot fail
+ return item.getAcceptedCommandTypes().stream().filter(commandType -> State.class.isAssignableFrom(commandType))
+ .findFirst().map(commandType -> commandType.asSubclass(State.class));
+ }
+
+ private void addPointTags(Item item, InfluxPoint.Builder point) {
+ if (configuration.isAddCategoryTag()) {
+ String categoryName = item.getCategory();
+ if (categoryName == null) {
+ categoryName = "n/a";
+ }
+ point.withTag(TAG_CATEGORY_NAME, categoryName);
+ }
+
+ if (configuration.isAddTypeTag()) {
+ point.withTag(TAG_TYPE_NAME, item.getType());
+ }
+
+ if (configuration.isAddLabelTag()) {
+ String labelName = item.getLabel();
+ if (labelName == null) {
+ labelName = "n/a";
+ }
+ point.withTag(TAG_LABEL_NAME, labelName);
+ }
+
+ final MetadataRegistry currentMetadataRegistry = metadataRegistry;
+ if (currentMetadataRegistry != null) {
+ MetadataKey key = new MetadataKey(InfluxDBPersistenceService.SERVICE_NAME, item.getName());
+ Metadata metadata = currentMetadataRegistry.get(key);
+ if (metadata != null) {
+ metadata.getConfiguration().forEach((tagName, tagValue) -> {
+ point.withTag(tagName, tagValue.toString());
+ });
+ }
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.persistence.influxdb.internal.influx1.Influx1FilterCriteriaQueryCreatorImpl;
+import org.openhab.persistence.influxdb.internal.influx1.InfluxDB1RepositoryImpl;
+import org.openhab.persistence.influxdb.internal.influx2.Influx2FilterCriteriaQueryCreatorImpl;
+import org.openhab.persistence.influxdb.internal.influx2.InfluxDB2RepositoryImpl;
+
+/**
+ * Factory that returns {@link InfluxDBRepository} and {@link FilterCriteriaQueryCreator} implementations
+ * depending on InfluxDB version
+ *
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class RepositoryFactory {
+
+ public static InfluxDBRepository createRepository(InfluxDBConfiguration influxDBConfiguration) {
+ switch (influxDBConfiguration.getVersion()) {
+ case V1:
+ return new InfluxDB1RepositoryImpl(influxDBConfiguration);
+ case V2:
+ return new InfluxDB2RepositoryImpl(influxDBConfiguration);
+ default:
+ throw new UnnexpectedConditionException("Not expected version " + influxDBConfiguration.getVersion());
+ }
+ }
+
+ public static FilterCriteriaQueryCreator createQueryCreator(InfluxDBConfiguration influxDBConfiguration) {
+ switch (influxDBConfiguration.getVersion()) {
+ case V1:
+ return new Influx1FilterCriteriaQueryCreatorImpl();
+ case V2:
+ return new Influx2FilterCriteriaQueryCreatorImpl();
+ default:
+ throw new UnnexpectedConditionException("Not expected version " + influxDBConfiguration.getVersion());
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * Thrown to indicate an unexpected condition that should not have happened (a bug)
+ *
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class UnnexpectedConditionException extends RuntimeException {
+ private static final long serialVersionUID = 1128380327167959556L;
+
+ public UnnexpectedConditionException(String message) {
+ super(message);
+ }
+
+ public UnnexpectedConditionException(String message, Throwable cause) {
+ super(message, cause);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal.influx1;
+
+import static org.influxdb.querybuilder.BuiltQuery.QueryBuilder.*;
+import static org.openhab.persistence.influxdb.internal.InfluxDBConstants.*;
+import static org.openhab.persistence.influxdb.internal.InfluxDBStateConvertUtils.stateToObject;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.influxdb.dto.Query;
+import org.influxdb.querybuilder.Appender;
+import org.influxdb.querybuilder.BuiltQuery;
+import org.influxdb.querybuilder.Select;
+import org.influxdb.querybuilder.Where;
+import org.influxdb.querybuilder.clauses.SimpleClause;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.persistence.influxdb.internal.FilterCriteriaQueryCreator;
+import org.openhab.persistence.influxdb.internal.InfluxDBVersion;
+
+/**
+ * Implementation of {@link FilterCriteriaQueryCreator} for InfluxDB 1.0
+ *
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class Influx1FilterCriteriaQueryCreatorImpl implements FilterCriteriaQueryCreator {
+
+ @Override
+ public String createQuery(FilterCriteria criteria, String retentionPolicy) {
+ final String tableName;
+ boolean hasCriteriaName = criteria.getItemName() != null;
+ if (hasCriteriaName) {
+ tableName = criteria.getItemName();
+ } else {
+ tableName = "/.*/";
+ }
+
+ Select select = select(COLUMN_VALUE_NAME_V1).fromRaw(null,
+ fullQualifiedTableName(retentionPolicy, tableName, hasCriteriaName));
+
+ Where where = select.where();
+ if (criteria.getBeginDate() != null) {
+ where = where.and(
+ BuiltQuery.QueryBuilder.gte(COLUMN_TIME_NAME_V1, criteria.getBeginDate().toInstant().toString()));
+ }
+ if (criteria.getEndDate() != null) {
+ where = where.and(
+ BuiltQuery.QueryBuilder.lte(COLUMN_TIME_NAME_V1, criteria.getEndDate().toInstant().toString()));
+ }
+
+ if (criteria.getState() != null && criteria.getOperator() != null) {
+ where = where.and(new SimpleClause(COLUMN_VALUE_NAME_V1,
+ getOperationSymbol(criteria.getOperator(), InfluxDBVersion.V1),
+ stateToObject(criteria.getState())));
+ }
+
+ if (criteria.getOrdering() == FilterCriteria.Ordering.DESCENDING) {
+ select = select.orderBy(desc());
+ } else if (criteria.getOrdering() == FilterCriteria.Ordering.ASCENDING) {
+ select = select.orderBy(asc());
+ }
+
+ if (criteria.getPageSize() != Integer.MAX_VALUE) {
+ if (criteria.getPageNumber() != 0) {
+ select = select.limit(criteria.getPageSize(), criteria.getPageSize() * criteria.getPageNumber());
+ } else {
+ select = select.limit(criteria.getPageSize());
+ }
+ }
+
+ final Query query = (Query) select;
+ return query.getCommand();
+ }
+
+ private String fullQualifiedTableName(String retentionPolicy, String tableName, boolean escapeTableName) {
+ StringBuilder sb = new StringBuilder();
+ Appender.appendName(retentionPolicy, sb);
+ sb.append(".");
+ if (escapeTableName) {
+ Appender.appendName(tableName, sb);
+ } else {
+ sb.append(tableName);
+ }
+ return sb.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal.influx1;
+
+import static org.openhab.persistence.influxdb.internal.InfluxDBConstants.COLUMN_TIME_NAME_V1;
+import static org.openhab.persistence.influxdb.internal.InfluxDBConstants.COLUMN_VALUE_NAME_V1;
+import static org.openhab.persistence.influxdb.internal.InfluxDBConstants.FIELD_VALUE_NAME;
+
+import java.time.Instant;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.influxdb.InfluxDB;
+import org.influxdb.InfluxDBFactory;
+import org.influxdb.dto.Point;
+import org.influxdb.dto.Pong;
+import org.influxdb.dto.Query;
+import org.influxdb.dto.QueryResult;
+import org.openhab.persistence.influxdb.internal.InfluxDBConfiguration;
+import org.openhab.persistence.influxdb.internal.InfluxDBRepository;
+import org.openhab.persistence.influxdb.internal.InfluxPoint;
+import org.openhab.persistence.influxdb.internal.InfluxRow;
+import org.openhab.persistence.influxdb.internal.UnnexpectedConditionException;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Implementation of {@link InfluxDBRepository} for InfluxDB 1.0
+ *
+ * @author Joan Pujol Espinar - Initial contribution. Most code has been moved from
+ * {@link org.openhab.persistence.influxdb.InfluxDBPersistenceService} where it was in the previous version
+ */
+@NonNullByDefault
+public class InfluxDB1RepositoryImpl implements InfluxDBRepository {
+ private final Logger logger = LoggerFactory.getLogger(InfluxDB1RepositoryImpl.class);
+ private InfluxDBConfiguration configuration;
+ @Nullable
+ private InfluxDB client;
+
+ public InfluxDB1RepositoryImpl(InfluxDBConfiguration configuration) {
+ this.configuration = configuration;
+ }
+
+ @Override
+ public boolean isConnected() {
+ return client != null;
+ }
+
+ @Override
+ public boolean connect() {
+ final InfluxDB createdClient = InfluxDBFactory.connect(configuration.getUrl(), configuration.getUser(),
+ configuration.getPassword());
+ createdClient.setDatabase(configuration.getDatabaseName());
+ createdClient.setRetentionPolicy(configuration.getRetentionPolicy());
+ createdClient.enableBatch(200, 100, TimeUnit.MILLISECONDS);
+ this.client = createdClient;
+ return checkConnectionStatus();
+ }
+
+ @Override
+ public void disconnect() {
+ this.client = null;
+ }
+
+ @Override
+ public boolean checkConnectionStatus() {
+ boolean dbStatus = false;
+ final InfluxDB currentClient = client;
+ if (currentClient != null) {
+ try {
+ Pong pong = currentClient.ping();
+ String version = pong.getVersion();
+ // maybe check for version >= 0.9
+ if (version != null && !version.contains("unknown")) {
+ dbStatus = true;
+ logger.debug("database status is OK, version is {}", version);
+ } else {
+ logger.warn("database ping error, version is: \"{}\" response time was \"{}\"", version,
+ pong.getResponseTime());
+ dbStatus = false;
+ }
+ } catch (RuntimeException e) {
+ dbStatus = false;
+ logger.error("database connection failed", e);
+ handleDatabaseException(e);
+ }
+ } else {
+ logger.warn("checkConnection: database is not connected");
+ }
+ return dbStatus;
+ }
+
+ private void handleDatabaseException(Exception e) {
+ logger.warn("database error: {}", e.getMessage(), e);
+ }
+
+ @Override
+ public void write(InfluxPoint point) {
+ final InfluxDB currentClient = this.client;
+ if (currentClient != null) {
+ Point clientPoint = convertPointToClientFormat(point);
+ currentClient.write(configuration.getDatabaseName(), configuration.getRetentionPolicy(), clientPoint);
+ } else {
+ logger.warn("Write point {} ignored due to client isn't connected", point);
+ }
+ }
+
+ private Point convertPointToClientFormat(InfluxPoint point) {
+ Point.Builder clientPoint = Point.measurement(point.getMeasurementName()).time(point.getTime().toEpochMilli(),
+ TimeUnit.MILLISECONDS);
+ setPointValue(point.getValue(), clientPoint);
+ point.getTags().entrySet().forEach(e -> clientPoint.tag(e.getKey(), e.getValue()));
+ return clientPoint.build();
+ }
+
+ private void setPointValue(@Nullable Object value, Point.Builder point) {
+ if (value instanceof String) {
+ point.addField(FIELD_VALUE_NAME, (String) value);
+ } else if (value instanceof Number) {
+ point.addField(FIELD_VALUE_NAME, (Number) value);
+ } else if (value instanceof Boolean) {
+ point.addField(FIELD_VALUE_NAME, (Boolean) value);
+ } else if (value == null) {
+ point.addField(FIELD_VALUE_NAME, (String) null);
+ } else {
+ throw new UnnexpectedConditionException("Not expected value type");
+ }
+ }
+
+ @Override
+ public List<InfluxRow> query(String query) {
+ final InfluxDB currentClient = client;
+ if (currentClient != null) {
+ Query parsedQuery = new Query(query, configuration.getDatabaseName());
+ List<QueryResult.Result> results = currentClient.query(parsedQuery, TimeUnit.MILLISECONDS).getResults();
+ return convertClientResultToRepository(results);
+ } else {
+ logger.warn("Returning empty list because queryAPI isn't present");
+ return Collections.emptyList();
+ }
+ }
+
+ private List<InfluxRow> convertClientResultToRepository(List<QueryResult.Result> results) {
+ List<InfluxRow> rows = new ArrayList<>();
+ for (QueryResult.Result result : results) {
+ List<QueryResult.Series> allSeries = result.getSeries();
+ if (result.getError() != null) {
+ logger.warn("{}", result.getError());
+ continue;
+ }
+ if (allSeries == null) {
+ logger.debug("query returned no series");
+ } else {
+ for (QueryResult.Series series : allSeries) {
+ logger.trace("series {}", series.toString());
+ String itemName = series.getName();
+ List<List<Object>> allValues = series.getValues();
+ if (allValues == null) {
+ logger.debug("query returned no values");
+ } else {
+ List<String> columns = series.getColumns();
+ logger.trace("columns {}", columns);
+ if (columns != null) {
+ Integer timestampColumn = null;
+ Integer valueColumn = null;
+ for (int i = 0; i < columns.size(); i++) {
+ String columnName = columns.get(i);
+ if (columnName.equals(COLUMN_TIME_NAME_V1)) {
+ timestampColumn = i;
+ } else if (columnName.equals(COLUMN_VALUE_NAME_V1)) {
+ valueColumn = i;
+ }
+ }
+ if (valueColumn == null || timestampColumn == null) {
+ throw new IllegalStateException("missing column");
+ }
+ for (int i = 0; i < allValues.size(); i++) {
+ Double rawTime = (Double) allValues.get(i).get(timestampColumn);
+ Instant time = Instant.ofEpochMilli(rawTime.longValue());
+ Object value = allValues.get(i).get(valueColumn);
+ logger.trace("adding historic item {}: time {} value {}", itemName, time, value);
+ rows.add(new InfluxRow(time, itemName, value));
+ }
+ }
+ }
+ }
+ }
+ }
+ return rows;
+ }
+
+ @Override
+ public Map<String, Integer> getStoredItemsCount() {
+ return Collections.emptyMap();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal.influx2;
+
+import static com.influxdb.query.dsl.functions.restriction.Restrictions.measurement;
+import static org.openhab.persistence.influxdb.internal.InfluxDBConstants.*;
+import static org.openhab.persistence.influxdb.internal.InfluxDBStateConvertUtils.stateToObject;
+
+import java.time.temporal.ChronoUnit;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.persistence.influxdb.internal.FilterCriteriaQueryCreator;
+import org.openhab.persistence.influxdb.internal.InfluxDBVersion;
+
+import com.influxdb.query.dsl.Flux;
+import com.influxdb.query.dsl.functions.RangeFlux;
+import com.influxdb.query.dsl.functions.restriction.Restrictions;
+
+/**
+ * Implementation of {@link FilterCriteriaQueryCreator} for InfluxDB 2.0
+ *
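+ * For instance, a criteria restricted to item "Temperature" produces a Flux sentence along the lines of
+ * {@code from(bucket:"autogen") |> range(start:-100y) |> filter(fn: (r) => r._measurement == "Temperature")}
+ * (illustrative; the exact rendering is delegated to the flux-dsl builder).
+ *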
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class Influx2FilterCriteriaQueryCreatorImpl implements FilterCriteriaQueryCreator {
+ @Override
+ public String createQuery(FilterCriteria criteria, String retentionPolicy) {
+ Flux flux = Flux.from(retentionPolicy);
+
+ if (criteria.getBeginDate() != null || criteria.getEndDate() != null) {
+ RangeFlux range = flux.range();
+ if (criteria.getBeginDate() != null) {
+ range = range.withStart(criteria.getBeginDate().toInstant());
+ }
+ if (criteria.getEndDate() != null) {
+ range = range.withStop(criteria.getEndDate().toInstant());
+ }
+ flux = range;
+ } else {
+ flux = flux.range(-100L, ChronoUnit.YEARS); // Flux needs a mandatory range
+ }
+
+ if (criteria.getItemName() != null) {
+ flux = flux.filter(measurement().equal(criteria.getItemName()));
+ }
+
+ if (criteria.getState() != null && criteria.getOperator() != null) {
+ Restrictions restrictions = Restrictions.and(Restrictions.field().equal(FIELD_VALUE_NAME),
+ Restrictions.value().custom(stateToObject(criteria.getState()),
+ getOperationSymbol(criteria.getOperator(), InfluxDBVersion.V2)));
+ flux = flux.filter(restrictions);
+ }
+
+ if (criteria.getOrdering() != null) {
+ boolean desc = criteria.getOrdering() == FilterCriteria.Ordering.DESCENDING;
+ flux = flux.sort().withDesc(desc).withColumns(new String[] { COLUMN_TIME_NAME_V2 });
+ }
+
+ if (criteria.getPageSize() != Integer.MAX_VALUE) {
+ flux = flux.limit(criteria.getPageSize()).withPropertyValue("offset",
+ criteria.getPageNumber() * criteria.getPageSize());
+ }
+
+ return flux.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal.influx2;
+
+import static org.openhab.persistence.influxdb.internal.InfluxDBConstants.*;
+
+import java.time.Instant;
+import java.util.Collections;
+import java.util.LinkedHashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.persistence.influxdb.internal.InfluxDBConfiguration;
+import org.openhab.persistence.influxdb.internal.InfluxDBConstants;
+import org.openhab.persistence.influxdb.internal.InfluxDBRepository;
+import org.openhab.persistence.influxdb.internal.InfluxPoint;
+import org.openhab.persistence.influxdb.internal.InfluxRow;
+import org.openhab.persistence.influxdb.internal.UnnexpectedConditionException;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.influxdb.client.InfluxDBClient;
+import com.influxdb.client.InfluxDBClientFactory;
+import com.influxdb.client.InfluxDBClientOptions;
+import com.influxdb.client.QueryApi;
+import com.influxdb.client.WriteApi;
+import com.influxdb.client.domain.Ready;
+import com.influxdb.client.domain.WritePrecision;
+import com.influxdb.client.write.Point;
+import com.influxdb.query.FluxTable;
+
+/**
+ * Implementation of {@link InfluxDBRepository} for InfluxDB 2.0
+ *
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class InfluxDB2RepositoryImpl implements InfluxDBRepository {
+ private final Logger logger = LoggerFactory.getLogger(InfluxDB2RepositoryImpl.class);
+ private InfluxDBConfiguration configuration;
+ @Nullable
+ private InfluxDBClient client;
+ @Nullable
+ private QueryApi queryAPI;
+ @Nullable
+ private WriteApi writeAPI;
+
+ public InfluxDB2RepositoryImpl(InfluxDBConfiguration configuration) {
+ this.configuration = configuration;
+ }
+
+    /**
+     * Returns whether the client has successfully connected to the server
+     *
+     * @return true if it's connected, otherwise false
+     */
+ @Override
+ public boolean isConnected() {
+ return client != null;
+ }
+
+ /**
+ * Connect to InfluxDB server
+ *
+ * @return True if successful, otherwise false
+ */
+ @Override
+ public boolean connect() {
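+        // For InfluxDB 2.x the configured database name maps to the organization and
+        // the configured retention policy maps to the bucket.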
+ InfluxDBClientOptions.Builder optionsBuilder = InfluxDBClientOptions.builder().url(configuration.getUrl())
+ .org(configuration.getDatabaseName()).bucket(configuration.getRetentionPolicy());
+ char[] token = configuration.getTokenAsCharArray();
+ if (token.length > 0) {
+ optionsBuilder.authenticateToken(token);
+ } else {
+ optionsBuilder.authenticate(configuration.getUser(), configuration.getPassword().toCharArray());
+ }
+ InfluxDBClientOptions clientOptions = optionsBuilder.build();
+
+ final InfluxDBClient createdClient = InfluxDBClientFactory.create(clientOptions);
+ this.client = createdClient;
+ logger.debug("Succesfully connected to InfluxDB. Instance ready={}", createdClient.ready());
+ queryAPI = createdClient.getQueryApi();
+ writeAPI = createdClient.getWriteApi();
+ return checkConnectionStatus();
+ }
+
+ /**
+ * Disconnect from InfluxDB server
+ */
+ @Override
+ public void disconnect() {
+ final InfluxDBClient currentClient = this.client;
+ if (currentClient != null) {
+ currentClient.close();
+ }
+ this.client = null;
+ }
+
+    /**
+     * Check if the connection is currently ready
+     *
+     * @return true if it's ready, otherwise false
+     */
+ @Override
+ public boolean checkConnectionStatus() {
+ final InfluxDBClient currentClient = client;
+ if (currentClient != null) {
+ Ready ready = currentClient.ready();
+ boolean isUp = ready != null && ready.getStatus() == Ready.StatusEnum.READY;
+ if (isUp) {
+ logger.debug("database status is OK");
+ } else {
+ logger.warn("database not ready");
+ }
+ return isUp;
+ } else {
+ logger.warn("checkConnection: database is not connected");
+ return false;
+ }
+ }
+
+    /**
+     * Write point to database
+     *
+     * @param point the point to write
+     */
+ @Override
+ public void write(InfluxPoint point) {
+ final WriteApi currentWriteAPI = writeAPI;
+ if (currentWriteAPI != null) {
+ currentWriteAPI.writePoint(convertPointToClientFormat(point));
+ } else {
+ logger.warn("Write point {} ignored due to writeAPI isn't present", point);
+ }
+ }
+
+ private Point convertPointToClientFormat(InfluxPoint point) {
+ Point clientPoint = Point.measurement(point.getMeasurementName()).time(point.getTime(), WritePrecision.MS);
+ setPointValue(point.getValue(), clientPoint);
+ point.getTags().entrySet().forEach(e -> clientPoint.addTag(e.getKey(), e.getValue()));
+ return clientPoint;
+ }
+
+ private void setPointValue(@Nullable Object value, Point point) {
+ if (value instanceof String) {
+ point.addField(FIELD_VALUE_NAME, (String) value);
+ } else if (value instanceof Number) {
+ point.addField(FIELD_VALUE_NAME, (Number) value);
+ } else if (value instanceof Boolean) {
+ point.addField(FIELD_VALUE_NAME, (Boolean) value);
+ } else if (value == null) {
+ point.addField(FIELD_VALUE_NAME, (String) null);
+ } else {
+            throw new UnnexpectedConditionException("Unexpected value type");
+ }
+ }
+
+ /**
+ * Executes Flux query
+ *
+ * @param query Query
+ * @return Query results
+ */
+ @Override
+ public List<InfluxRow> query(String query) {
+ final QueryApi currentQueryAPI = queryAPI;
+ if (currentQueryAPI != null) {
+ List<FluxTable> clientResult = currentQueryAPI.query(query);
+            return convertClientResultToRepository(clientResult);
+ } else {
+ logger.warn("Returning empty list because queryAPI isn't present");
+ return Collections.emptyList();
+ }
+ }
+
+    private List<InfluxRow> convertClientResultToRepository(List<FluxTable> clientResult) {
+ return clientResult.stream().flatMap(this::mapRawResultToHistoric).collect(Collectors.toList());
+ }
+
+ private Stream<InfluxRow> mapRawResultToHistoric(FluxTable rawRow) {
+ return rawRow.getRecords().stream().map(r -> {
+ String itemName = (String) r.getValueByKey(InfluxDBConstants.TAG_ITEM_NAME);
+ Object value = r.getValueByKey(COLUMN_VALUE_NAME_V2);
+ Instant time = (Instant) r.getValueByKey(COLUMN_TIME_NAME_V2);
+ return new InfluxRow(time, itemName, value);
+ });
+ }
+
+    /**
+     * Return all stored item names with their count of stored points
+     *
+     * @return Map with <ItemName,ItemCount> entries
+ */
+ @Override
+ public Map<String, Integer> getStoredItemsCount() {
+ final QueryApi currentQueryAPI = queryAPI;
+
+ if (currentQueryAPI != null) {
+ Map<String, Integer> result = new LinkedHashMap<>();
+            // Query written by hand, see https://github.com/influxdata/influxdb-client-java/issues/75
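+            // With the default "openhab" bucket and the "item" tag name, this resolves
+            // to roughly:
+            //   from(bucket: "openhab")
+            //     |> range(start:-365d)
+            //     |> filter(fn: (r) => exists r.item)
+            //     |> group(columns: ["item"], mode:"by")
+            //     |> count()
+            //     |> group()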
+ String query = "from(bucket: \"" + configuration.getRetentionPolicy() + "\")\n"
+ + " |> range(start:-365d)\n" + " |> filter(fn: (r) => exists r." + TAG_ITEM_NAME + " )\n"
+ + " |> group(columns: [\"" + TAG_ITEM_NAME + "\"], mode:\"by\")\n" + " |> count()\n"
+ + " |> group()";
+
+ List<FluxTable> queryResult = currentQueryAPI.query(query);
+ queryResult.stream().findFirst().orElse(new FluxTable()).getRecords().forEach(row -> {
+ result.put((String) row.getValueByKey(TAG_ITEM_NAME), ((Number) row.getValue()).intValue());
+ });
+ return result;
+ } else {
+ logger.warn("Returning empty result because queryAPI isn't present");
+ return Collections.emptyMap();
+ }
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<config-description:config-descriptions
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:config-description="https://openhab.org/schemas/config-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/config-description/v1.0.0
+ https://openhab.org/schemas/config-description-1.0.0.xsd">
+ <config-description uri="persistence:influxdb">
+
+ <parameter-group name="connection">
+ <label>Connection</label>
+ <description>This group defines connection parameters.</description>
+ <advanced>false</advanced>
+ </parameter-group>
+
+ <parameter-group name="tags">
+ <label>Additional Tags</label>
+ <description>This group defines additional tags which can be added.</description>
+ <advanced>false</advanced>
+ </parameter-group>
+
+ <parameter-group name="misc">
+ <label>Miscellaneous</label>
+ <description>This group defines miscellaneous parameters.</description>
+ <advanced>false</advanced>
+ </parameter-group>
+
+ <parameter name="url" type="text" required="true" groupName="connection">
+ <context>url</context>
+ <label>Database URL</label>
+ <description>The database URL, e.g. http://127.0.0.1:8086 or http://127.0.0.1:9999</description>
+ <default>http://127.0.0.1:8086</default>
+ </parameter>
+
+ <parameter name="version" type="text" required="true" groupName="connection">
+ <label>Database Version</label>
+ <description>InfluxDB version</description>
+ <default>V1</default>
+ <options>
+ <option value="V1">InfluxDB 1</option>
+ <option value="V2">InfluxDB 2</option>
+ </options>
+ </parameter>
+
+ <parameter name="user" type="text" required="true" groupName="connection">
+ <label>Username</label>
+ <description>Database username</description>
+ <default>openhab</default>
+ </parameter>
+
+ <parameter name="password" type="text" required="false" groupName="connection">
+ <context>password</context>
+ <label>Database Password</label>
+ <description>Database password</description>
+ </parameter>
+
+ <parameter name="token" type="text" required="false" groupName="connection">
+ <label>Authentication Token</label>
+			<description>The token to authenticate to the database (alternative to username/password for InfluxDB 2.0)
+ </description>
+ </parameter>
+
+ <parameter name="db" type="text" required="true" groupName="connection">
+ <label>Database/Organization</label>
+			<description>The name of the database (InfluxDB 1.0) or organization (InfluxDB 2.0)</description>
+ <default>openhab</default>
+ </parameter>
+
+ <parameter name="retentionPolicy" type="text" required="true" groupName="connection">
+ <label>Retention Policy / Bucket</label>
+			<description>The name of the retention policy (InfluxDB 1.0) or bucket (InfluxDB 2.0) to write data to
+ </description>
+ <default>openhab</default>
+ </parameter>
+
+ <parameter name="replaceUnderscore" type="boolean" required="true" groupName="misc">
+ <label>Replace Underscore</label>
+			<description>Whether underscores "_" in item names should be replaced by a dot "." ("test_item" ->
+				"test.item"). Only applies to the measurement name, not to tags. Also applies to alias names.
+			</description>
+ <default>false</default>
+ </parameter>
+
+ <parameter name="addCategoryTag" type="boolean" required="true" groupName="tags">
+ <label>Add Category Tag</label>
+ <description>Should the category of the item be included as tag "category"? If no category is set, "n/a" is
+ used.
+ </description>
+ <default>false</default>
+ </parameter>
+
+ <parameter name="addTypeTag" type="boolean" required="true" groupName="tags">
+ <label>Add Type Tag</label>
+ <description>Should the item type be included as tag "type"?</description>
+ <default>false</default>
+ </parameter>
+
+ <parameter name="addLabelTag" type="boolean" required="true" groupName="tags">
+ <label>Add Label Tag</label>
+ <description>Should the item label be included as tag "label"? If no label is set, "n/a" is used.
+ </description>
+ <default>false</default>
+ </parameter>
+
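+		<!-- A minimal services/influxdb.cfg sketch for an InfluxDB 2.x instance using
+			these parameters (all values are placeholders):
+
+			url=http://127.0.0.1:9999
+			version=V2
+			token=mytoken
+			db=my-org
+			retentionPolicy=my-bucket
+		-->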
+ </config-description>
+</config-description:config-descriptions>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import static org.openhab.persistence.influxdb.internal.InfluxDBConfiguration.DATABASE_PARAM;
+import static org.openhab.persistence.influxdb.internal.InfluxDBConfiguration.RETENTION_POLICY_PARAM;
+import static org.openhab.persistence.influxdb.internal.InfluxDBConfiguration.TOKEN_PARAM;
+import static org.openhab.persistence.influxdb.internal.InfluxDBConfiguration.URL_PARAM;
+import static org.openhab.persistence.influxdb.internal.InfluxDBConfiguration.VERSION_PARAM;
+
+import java.util.HashMap;
+import java.util.Map;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+
+/**
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class ConfigurationTestHelper {
+
+ public static Map<String, @Nullable Object> createValidConfigurationParameters() {
+ Map<String, @Nullable Object> config = new HashMap<>();
+ config.put(URL_PARAM, "http://localhost:9999");
+ config.put(VERSION_PARAM, InfluxDBVersion.V2.name());
+ config.put(TOKEN_PARAM, "sampletoken");
+ config.put(DATABASE_PARAM, "openhab");
+ config.put(RETENTION_POLICY_PARAM, "default");
+ return config;
+ }
+
+ public static InfluxDBConfiguration createValidConfiguration() {
+ return new InfluxDBConfiguration(createValidConfigurationParameters());
+ }
+
+ public static Map<String, @Nullable Object> createInvalidConfigurationParameters() {
+ Map<String, @Nullable Object> config = createValidConfigurationParameters();
+ config.remove(TOKEN_PARAM);
+ return config;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import static org.mockito.ArgumentMatchers.any;
+import static org.mockito.Mockito.*;
+
+import java.util.Map;
+
+import org.eclipse.jdt.annotation.DefaultLocation;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.items.MetadataRegistry;
+import org.openhab.persistence.influxdb.InfluxDBPersistenceService;
+
+/**
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@ExtendWith(MockitoExtension.class)
+@NonNullByDefault(value = { DefaultLocation.PARAMETER, DefaultLocation.RETURN_TYPE })
+public class InfluxDBPersistenceServiceTest {
+ private InfluxDBPersistenceService instance;
+
+ private @Mock InfluxDBRepository influxDBRepository;
+
+ private Map<String, @Nullable Object> validConfig;
+ private Map<String, @Nullable Object> invalidConfig;
+
+ @BeforeEach
+ public void before() {
+ instance = new InfluxDBPersistenceService(mock(ItemRegistry.class), mock(MetadataRegistry.class)) {
+ @Override
+ protected InfluxDBRepository createInfluxDBRepository() {
+ return influxDBRepository;
+ }
+ };
+
+ validConfig = ConfigurationTestHelper.createValidConfigurationParameters();
+ invalidConfig = ConfigurationTestHelper.createInvalidConfigurationParameters();
+ }
+
+ @AfterEach
+ public void after() {
+ validConfig = null;
+ invalidConfig = null;
+ instance = null;
+ influxDBRepository = null;
+ }
+
+ @Test
+ public void activateWithValidConfigShouldConnectRepository() {
+ instance.activate(validConfig);
+ verify(influxDBRepository).connect();
+ }
+
+ @Test
+ public void activateWithInvalidConfigShouldNotConnectRepository() {
+ instance.activate(invalidConfig);
+ verify(influxDBRepository, never()).connect();
+ }
+
+ @Test
+ public void activateWithNullConfigShouldNotConnectRepository() {
+ instance.activate(null);
+ verify(influxDBRepository, never()).connect();
+ }
+
+ @Test
+ public void deactivateShouldDisconnectRepository() {
+ instance.activate(validConfig);
+ instance.deactivate();
+ verify(influxDBRepository).disconnect();
+ }
+
+ @Test
+ public void storeItemWithConnectedRepository() {
+ instance.activate(validConfig);
+ when(influxDBRepository.isConnected()).thenReturn(true);
+ instance.store(ItemTestHelper.createNumberItem("number", 5));
+ verify(influxDBRepository).write(any());
+ }
+
+ @Test
+ public void storeItemWithDisconnectedRepositoryIsIgnored() {
+ instance.activate(validConfig);
+ when(influxDBRepository.isConnected()).thenReturn(false);
+ instance.store(ItemTestHelper.createNumberItem("number", 5));
+ verify(influxDBRepository, never()).write(any());
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.*;
+
+import java.math.BigDecimal;
+import java.time.Instant;
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.junit.jupiter.api.Test;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+
+/**
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class InfluxDBStateConvertUtilsTest {
+
+ @Test
+ public void convertDecimalState() {
+ DecimalType decimalType = new DecimalType(new BigDecimal("1.12"));
+ assertThat((Double) InfluxDBStateConvertUtils.stateToObject(decimalType), closeTo(1.12, 0.01));
+ }
+
+ @Test
+ public void convertOnOffState() {
+ assertThat(InfluxDBStateConvertUtils.stateToObject(OpenClosedType.OPEN), equalTo(1));
+ assertThat(InfluxDBStateConvertUtils.stateToObject(OnOffType.ON), equalTo(1));
+ }
+
+ @Test
+ public void convertDateTimeState() {
+ ZonedDateTime now = ZonedDateTime.now();
+ long nowInMillis = now.toInstant().toEpochMilli();
+ DateTimeType type = new DateTimeType(now);
+ assertThat(InfluxDBStateConvertUtils.stateToObject(type), equalTo(nowInMillis));
+ }
+
+ @Test
+ public void convertDecimalToState() {
+ BigDecimal val = new BigDecimal("1.12");
+ NumberItem item = new NumberItem("name");
+ assertThat(InfluxDBStateConvertUtils.objectToState(val, item), equalTo(new DecimalType(val)));
+ }
+
+ @Test
+ public void convertOnOffToState() {
+ boolean val1 = true;
+ int val2 = 1;
+ SwitchItem onOffItem = new SwitchItem("name");
+ ContactItem contactItem = new ContactItem("name");
+ assertThat(InfluxDBStateConvertUtils.objectToState(val1, onOffItem), equalTo(OnOffType.ON));
+ assertThat(InfluxDBStateConvertUtils.objectToState(val2, onOffItem), equalTo(OnOffType.ON));
+ assertThat(InfluxDBStateConvertUtils.objectToState(val1, contactItem), equalTo(OpenClosedType.OPEN));
+ assertThat(InfluxDBStateConvertUtils.objectToState(val2, contactItem), equalTo(OpenClosedType.OPEN));
+ }
+
+ @Test
+ public void convertDateTimeToState() {
+ long val = System.currentTimeMillis();
+ DateTimeItem item = new DateTimeItem("name");
+
+ DateTimeType expected = new DateTimeType(
+ ZonedDateTime.ofInstant(Instant.ofEpochMilli(val), ZoneId.systemDefault()));
+ assertThat(InfluxDBStateConvertUtils.objectToState(val, item), equalTo(expected));
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import static org.hamcrest.CoreMatchers.equalTo;
+import static org.hamcrest.MatcherAssert.assertThat;
+
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.time.format.DateTimeFormatter;
+import java.time.temporal.ChronoUnit;
+
+import org.eclipse.jdt.annotation.DefaultLocation;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.persistence.influxdb.internal.influx1.Influx1FilterCriteriaQueryCreatorImpl;
+import org.openhab.persistence.influxdb.internal.influx2.Influx2FilterCriteriaQueryCreatorImpl;
+
+/**
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault({ DefaultLocation.RETURN_TYPE, DefaultLocation.PARAMETER })
+public class InfluxFilterCriteriaQueryCreatorImplTest {
+ private static final String RETENTION_POLICY = "origin";
+ public static final String ITEM_NAME = "sampleItem";
+
+ private static final DateTimeFormatter INFLUX2_DATE_FORMATTER = DateTimeFormatter
+ .ofPattern("yyyy-MM-dd'T'HH:mm:ss.nnnnnnnnn'Z'").withZone(ZoneId.of("UTC"));
+
+ private Influx1FilterCriteriaQueryCreatorImpl instanceV1;
+ private Influx2FilterCriteriaQueryCreatorImpl instanceV2;
+
+ @BeforeEach
+ public void before() {
+ instanceV1 = new Influx1FilterCriteriaQueryCreatorImpl();
+ instanceV2 = new Influx2FilterCriteriaQueryCreatorImpl();
+ }
+
+ @AfterEach
+ public void after() {
+ instanceV1 = null;
+ instanceV2 = null;
+ }
+
+ @Test
+ public void testSimpleItemQueryWithoutParams() {
+ FilterCriteria criteria = createBaseCriteria();
+
+ String queryV1 = instanceV1.createQuery(criteria, RETENTION_POLICY);
+ assertThat(queryV1, equalTo("SELECT value FROM origin.sampleItem;"));
+
+ String queryV2 = instanceV2.createQuery(criteria, RETENTION_POLICY);
+ assertThat(queryV2, equalTo("from(bucket:\"origin\")\n\t" + "|> range(start:-100y)\n\t"
+ + "|> filter(fn: (r) => r[\"_measurement\"] == \"sampleItem\")"));
+ }
+
+ @Test
+ public void testEscapeSimpleItem() {
+ FilterCriteria criteria = createBaseCriteria("sample.Item");
+
+ String queryV1 = instanceV1.createQuery(criteria, RETENTION_POLICY);
+ assertThat(queryV1, equalTo("SELECT value FROM origin.\"sample.Item\";"));
+
+ String queryV2 = instanceV2.createQuery(criteria, RETENTION_POLICY);
+ assertThat(queryV2, equalTo("from(bucket:\"origin\")\n\t" + "|> range(start:-100y)\n\t"
+ + "|> filter(fn: (r) => r[\"_measurement\"] == \"sample.Item\")"));
+ }
+
+ @Test
+ public void testSimpleUnboundedItemWithoutParams() {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setOrdering(null);
+
+ String queryV1 = instanceV1.createQuery(criteria, RETENTION_POLICY);
+ assertThat(queryV1, equalTo("SELECT value FROM origin./.*/;"));
+
+ String queryV2 = instanceV2.createQuery(criteria, RETENTION_POLICY);
+ assertThat(queryV2, equalTo("from(bucket:\"origin\")\n\t" + "|> range(start:-100y)"));
+ }
+
+ @Test
+ public void testRangeCriteria() {
+ FilterCriteria criteria = createBaseCriteria();
+ ZonedDateTime now = ZonedDateTime.now();
+ ZonedDateTime tomorrow = now.plus(1, ChronoUnit.DAYS);
+ criteria.setBeginDate(now);
+ criteria.setEndDate(tomorrow);
+
+ String queryV1 = instanceV1.createQuery(criteria, RETENTION_POLICY);
+ String expectedQueryV1 = String.format(
+ "SELECT value FROM origin.sampleItem WHERE time >= '%s' AND time <= '%s';", now.toInstant(),
+ tomorrow.toInstant());
+ assertThat(queryV1, equalTo(expectedQueryV1));
+
+ String queryV2 = instanceV2.createQuery(criteria, RETENTION_POLICY);
+ String expectedQueryV2 = String.format(
+ "from(bucket:\"origin\")\n\t" + "|> range(start:%s, stop:%s)\n\t"
+ + "|> filter(fn: (r) => r[\"_measurement\"] == \"sampleItem\")",
+ INFLUX2_DATE_FORMATTER.format(now.toInstant()), INFLUX2_DATE_FORMATTER.format(tomorrow.toInstant()));
+ assertThat(queryV2, equalTo(expectedQueryV2));
+ }
+
+ @Test
+ public void testValueOperator() {
+ FilterCriteria criteria = createBaseCriteria();
+ criteria.setOperator(FilterCriteria.Operator.LTE);
+ criteria.setState(new PercentType(90));
+
+ String query = instanceV1.createQuery(criteria, RETENTION_POLICY);
+ assertThat(query, equalTo("SELECT value FROM origin.sampleItem WHERE value <= 90;"));
+
+ String queryV2 = instanceV2.createQuery(criteria, RETENTION_POLICY);
+ assertThat(queryV2,
+ equalTo("from(bucket:\"origin\")\n\t" + "|> range(start:-100y)\n\t"
+ + "|> filter(fn: (r) => r[\"_measurement\"] == \"sampleItem\")\n\t"
+ + "|> filter(fn: (r) => (r[\"_field\"] == \"value\" and r[\"_value\"] <= 90))"));
+ }
+
+ @Test
+ public void testPagination() {
+ FilterCriteria criteria = createBaseCriteria();
+ criteria.setPageNumber(2);
+ criteria.setPageSize(10);
+
+ String query = instanceV1.createQuery(criteria, RETENTION_POLICY);
+ assertThat(query, equalTo("SELECT value FROM origin.sampleItem LIMIT 10 OFFSET 20;"));
+
+ String queryV2 = instanceV2.createQuery(criteria, RETENTION_POLICY);
+ assertThat(queryV2, equalTo("from(bucket:\"origin\")\n\t" + "|> range(start:-100y)\n\t"
+ + "|> filter(fn: (r) => r[\"_measurement\"] == \"sampleItem\")\n\t" + "|> limit(n:10, offset:20)"));
+ }
+
+ @Test
+ public void testOrdering() {
+ FilterCriteria criteria = createBaseCriteria();
+ criteria.setOrdering(FilterCriteria.Ordering.ASCENDING);
+
+ String query = instanceV1.createQuery(criteria, RETENTION_POLICY);
+ assertThat(query, equalTo("SELECT value FROM origin.sampleItem ORDER BY time ASC;"));
+
+ String queryV2 = instanceV2.createQuery(criteria, RETENTION_POLICY);
+ assertThat(queryV2,
+ equalTo("from(bucket:\"origin\")\n\t" + "|> range(start:-100y)\n\t"
+ + "|> filter(fn: (r) => r[\"_measurement\"] == \"sampleItem\")\n\t"
+ + "|> sort(desc:false, columns:[\"_time\"])"));
+ }
+
+ private FilterCriteria createBaseCriteria() {
+ return createBaseCriteria(ITEM_NAME);
+ }
+
+ private FilterCriteria createBaseCriteria(String sampleItem) {
+ FilterCriteria criteria = new FilterCriteria();
+ criteria.setItemName(sampleItem);
+ criteria.setOrdering(null);
+ return criteria;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.types.DecimalType;
+
+/**
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@NonNullByDefault
+public class ItemTestHelper {
+
+ public static NumberItem createNumberItem(String name, int value) {
+ NumberItem numberItem = new NumberItem(name);
+ numberItem.setState(new DecimalType(value));
+ return numberItem;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.influxdb.internal;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.Matchers.*;
+import static org.mockito.Mockito.when;
+
+import java.math.BigInteger;
+import java.util.Map;
+
+import org.eclipse.jdt.annotation.DefaultLocation;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.ExtendWith;
+import org.mockito.Mock;
+import org.mockito.junit.jupiter.MockitoExtension;
+import org.openhab.core.items.Metadata;
+import org.openhab.core.items.MetadataKey;
+import org.openhab.core.items.MetadataRegistry;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.persistence.influxdb.InfluxDBPersistenceService;
+
+/**
+ * @author Joan Pujol Espinar - Initial contribution
+ */
+@ExtendWith(MockitoExtension.class)
+@SuppressWarnings("null") // In case of any NPE it will cause test fail that it's the expected result
+@NonNullByDefault(value = { DefaultLocation.PARAMETER, DefaultLocation.RETURN_TYPE })
+public class ItemToStorePointCreatorTest {
+
+ private @Mock InfluxDBConfiguration influxDBConfiguration;
+ private @Mock MetadataRegistry metadataRegistry;
+ private ItemToStorePointCreator instance;
+
+ @BeforeEach
+ public void before() {
+ when(influxDBConfiguration.isAddCategoryTag()).thenReturn(false);
+ when(influxDBConfiguration.isAddLabelTag()).thenReturn(false);
+ when(influxDBConfiguration.isAddTypeTag()).thenReturn(false);
+ when(influxDBConfiguration.isReplaceUnderscore()).thenReturn(false);
+
+ instance = new ItemToStorePointCreator(influxDBConfiguration, metadataRegistry);
+ }
+
+ @AfterEach
+ public void after() {
+ instance = null;
+ influxDBConfiguration = null;
+ metadataRegistry = null;
+ }
+
+ @Test
+ public void convertBasicItem() {
+ NumberItem item = ItemTestHelper.createNumberItem("myitem", 5);
+ InfluxPoint point = instance.convert(item, null);
+
+ assertThat(point.getMeasurementName(), equalTo(item.getName()));
+ assertThat("Must Store item name", point.getTags(), hasEntry("item", item.getName()));
+ assertThat(point.getValue(), equalTo(new BigInteger("5")));
+ }
+
+ @Test
+ public void shouldUseAliasAsMeasurementNameIfProvided() {
+ NumberItem item = ItemTestHelper.createNumberItem("myitem", 5);
+ InfluxPoint point = instance.convert(item, "aliasName");
+ assertThat(point.getMeasurementName(), is("aliasName"));
+ }
+
+ @Test
+ public void shouldStoreCategoryTagIfProvidedAndConfigured() {
+ NumberItem item = ItemTestHelper.createNumberItem("myitem", 5);
+ item.setCategory("categoryValue");
+
+ when(influxDBConfiguration.isAddCategoryTag()).thenReturn(true);
+ InfluxPoint point = instance.convert(item, null);
+ assertThat(point.getTags(), hasEntry(InfluxDBConstants.TAG_CATEGORY_NAME, "categoryValue"));
+
+ when(influxDBConfiguration.isAddCategoryTag()).thenReturn(false);
+ point = instance.convert(item, null);
+ assertThat(point.getTags(), not(hasKey(InfluxDBConstants.TAG_CATEGORY_NAME)));
+ }
+
+ @Test
+ public void shouldStoreTypeTagIfProvidedAndConfigured() {
+ NumberItem item = ItemTestHelper.createNumberItem("myitem", 5);
+
+ when(influxDBConfiguration.isAddTypeTag()).thenReturn(true);
+ InfluxPoint point = instance.convert(item, null);
+ assertThat(point.getTags(), hasEntry(InfluxDBConstants.TAG_TYPE_NAME, "Number"));
+
+ when(influxDBConfiguration.isAddTypeTag()).thenReturn(false);
+ point = instance.convert(item, null);
+ assertThat(point.getTags(), not(hasKey(InfluxDBConstants.TAG_TYPE_NAME)));
+ }
+
+ @Test
+ public void shouldStoreTypeLabelIfProvidedAndConfigured() {
+ NumberItem item = ItemTestHelper.createNumberItem("myitem", 5);
+ item.setLabel("ItemLabel");
+
+ when(influxDBConfiguration.isAddLabelTag()).thenReturn(true);
+ InfluxPoint point = instance.convert(item, null);
+ assertThat(point.getTags(), hasEntry(InfluxDBConstants.TAG_LABEL_NAME, "ItemLabel"));
+
+ when(influxDBConfiguration.isAddLabelTag()).thenReturn(false);
+ point = instance.convert(item, null);
+ assertThat(point.getTags(), not(hasKey(InfluxDBConstants.TAG_LABEL_NAME)));
+ }
+
+ @Test
+ public void shouldStoreMetadataAsTagsIfProvided() {
+ NumberItem item = ItemTestHelper.createNumberItem("myitem", 5);
+ MetadataKey metadataKey = new MetadataKey(InfluxDBPersistenceService.SERVICE_NAME, item.getName());
+
+ when(metadataRegistry.get(metadataKey))
+ .thenReturn(new Metadata(metadataKey, "", Map.of("key1", "val1", "key2", "val2")));
+
+ InfluxPoint point = instance.convert(item, null);
+ assertThat(point.getTags(), hasEntry("key1", "val1"));
+ assertThat(point.getTags(), hasEntry("key2", "val2"));
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="test" value="true"/>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+/build/
+/drivers/
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.persistence.jdbc</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+# JDBC Persistence
+
+This service writes and reads item states to and from a number of relational database systems that support [Java Database Connectivity (JDBC)](https://en.wikipedia.org/wiki/Java_Database_Connectivity).
+This service allows you to persist state updates using one of several different underlying database services.
+It is designed for maximum scalability, storing very large amounts of data without losing speed over the years.
+
+The generic design makes it relatively easy for developers to integrate other databases that have JDBC drivers.
+The following databases are currently supported and tested:
+
+| Database | Tested Driver / Version |
+| -------------------------------------------- | ------------------------------------------------------------ |
+| [Apache Derby](https://db.apache.org/derby/) | [derby-10.12.1.1.jar](https://mvnrepository.com/artifact/org.apache.derby/derby) |
+| [H2](https://www.h2database.com/) | [h2-1.4.191.jar](https://mvnrepository.com/artifact/com.h2database/h2) |
+| [HSQLDB](http://hsqldb.org/) | [hsqldb-2.3.3.jar](https://mvnrepository.com/artifact/org.hsqldb/hsqldb) |
+| [MariaDB](https://mariadb.org/) | [mariadb-java-client-1.4.6.jar](https://mvnrepository.com/artifact/org.mariadb.jdbc/mariadb-java-client) |
+| [MySQL](https://www.mysql.com/) | [mysql-connector-java-5.1.39.jar](https://mvnrepository.com/artifact/mysql/mysql-connector-java) |
+| [PostgreSQL](https://www.postgresql.org/) | [postgresql-9.4.1209.jre7.jar](https://mvnrepository.com/artifact/org.postgresql/postgresql) |
+| [SQLite](https://www.sqlite.org/) | [sqlite-jdbc-3.16.1.jar](https://mvnrepository.com/artifact/org.xerial/sqlite-jdbc) |
+
+## Table of Contents
+
+<!-- MarkdownTOC -->
+
+- [Configuration](#configuration)
+ - [Minimal Configuration](#minimal-configuration)
+ - [Migration from MySQL to JDBC Persistence Services](#migration-from-mysql-to-jdbc-persistence-services)
+- [Technical Notes](#technical-notes)
+ - [Database Table Schema](#database-table-schema)
+ - [Number Precision](#number-precision)
+ - [Rounding results](#rounding-results)
+ - [For Developers](#for-developers)
+ - [Performance Tests](#performance-tests)
+
+<!-- /MarkdownTOC -->
+
+## Configuration
+
+This service can be configured in the file `services/jdbc.cfg`.
+
+| Property | Default | Required | Description |
+| ------------------------- | ------------------------------------------------------------ | :-------: | ------------------------------------------------------------ |
+| url                       |                                                              | Yes       | JDBC URL to establish a connection to your database. Examples:<br/><br/>`jdbc:derby:./testDerby;create=true`<br/>`jdbc:h2:./testH2`<br/>`jdbc:hsqldb:./testHsqlDb`<br/>`jdbc:mariadb://192.168.0.1:3306/testMariadb`<br/>`jdbc:mysql://192.168.0.1:3306/testMysql?serverTimezone=UTC`<br/>`jdbc:postgresql://192.168.0.1:5432/testPostgresql`<br/>`jdbc:sqlite:./testSqlite.db`.<br/><br/>If no database is available, it will be created; for example, the URL `jdbc:h2:./testH2` creates a new H2 database in the openHAB folder. Example to create your own MySQL database directly:<br/><br/>`CREATE DATABASE yourDB CHARACTER SET utf8 COLLATE utf8_general_ci;` |
+| user | | if needed | database user name |
+| password | | if needed | database user password |
+| errReconnectThreshold     | 0                                                            | No        | number of errors after which the service is deactivated (0 means errors are ignored) |
+| sqltype.CALL | `VARCHAR(200)` | No | All `sqlType` options allow you to change the SQL data type used to store values for different openHAB item states. See the following links for further information: [mybatis](https://mybatis.github.io/mybatis-3/apidocs/reference/org/apache/ibatis/type/JdbcType.html) [H2](http://www.h2database.com/html/datatypes.html) [PostgresSQL](http://www.postgresql.org/docs/9.3/static/datatype.html) |
+| sqltype.COLOR | `VARCHAR(70)` | No | see above |
+| sqltype.CONTACT | `VARCHAR(6)` | No | see above |
+| sqltype.DATETIME | `DATETIME` | No | see above |
+| sqltype.DIMMER | `TINYINT` | No | see above |
+| sqltype.LOCATION | `VARCHAR(30)` | No | see above |
+| sqltype.NUMBER | `DOUBLE` | No | see above |
+| sqltype.ROLLERSHUTTER | `TINYINT` | No | see above |
+| sqltype.STRING | `VARCHAR(65500)` | No | see above |
+| sqltype.SWITCH | `VARCHAR(6)` | No | see above |
+| sqltype.tablePrimaryKey | `TIMESTAMP` | No | type of `time` column for newly created item tables |
+| sqltype.tablePrimaryValue | `NOW()` | No | value of `time` column for newly inserted rows |
+| numberDecimalcount        | 3                                                            | No        | default decimal digit count for item type `Number` |
+| tableNamePrefix           | `item`                                                       | No        | table name prefix. For migration from the MySQL persistence service, set this to `Item`. |
+| tableUseRealItemNames | `false` | No | table name prefix generation. When set to `true`, real item names are used for table names and `tableNamePrefix` is ignored. When set to `false`, the `tableNamePrefix` is used to generate table names with sequential numbers. |
+| tableIdDigitCount | 4 | No | when `tableUseRealItemNames` is `false` and thus table names are generated sequentially, this controls how many zero-padded digits are used in the table name. With the default of 4, the first table name will end with `0001`. For migration from the MySQL persistence service, set this to 0. |
+| rebuildTableNames         | false                                                        | No        | rename existing tables using `tableUseRealItemNames` and `tableIdDigitCount`. USE WITH CARE! Deactivate after renaming is done! |
+| jdbc.maximumPoolSize      | configured per database in package `org.openhab.persistence.jdbc.db.*` | No | Some embedded databases can handle only one connection. See [this link](https://github.com/brettwooldridge/HikariCP/issues/256) for more information |
+| jdbc.minimumIdle | see above | No | see above |
+| enableLogTime             | `false`                                                      | No        | timekeeping: log execution times of database operations |
+
+All item- and event-related configuration is done in the file `persistence/jdbc.persist`.
+
+To configure this service as the default persistence service for openHAB 2, add or change the line
+
+```
+org.openhab.core.persistence:default=jdbc
+```
+
+in the file `services/runtime.cfg`.
+
+### Minimal Configuration
+
+services/jdbc.cfg
+
+```
+url=jdbc:postgresql://192.168.0.1:5432/testPostgresql
+```
+
+### Migration from MySQL to JDBC Persistence Services
+
+The JDBC Persistence service can act as a replacement for the MySQL Persistence service.
+Here is an example of a configuration for a MySQL database named `testMysql` with user `test` and password `test`:
+
+services/jdbc.cfg
+
+```
+url=jdbc:mysql://192.168.0.1:3306/testMysql
+user=test
+password=test
+tableNamePrefix=Item
+tableUseRealItemNames=false
+tableIdDigitCount=0
+```
+
+Remember to install and uninstall the services you want, and rename `persistence/mysql.persist` to `persistence/jdbc.persist`.
+
+## Technical Notes
+
+### Database Table Schema
+
+The table name schema can be reconfigured after creation, if needed.
+
+The service will create a mapping table to link each item to a table, and a separate table is generated for each item.
+The item data tables include time and data values.
+The SQL data type used depends on the openHAB item type, and allows the item state to be recovered back into openHAB in the same way it was stored.
+
+This *per-item* layout keeps the database scalable and easy to maintain, even when large amounts of data must be managed.
+To rename existing tables, use the parameters `tableUseRealItemNames` and `tableIdDigitCount` in the configuration.
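+
+As an illustration, with the default settings the resulting layout looks roughly like this (a sketch based on the table-creation statements the service uses; the actual `value` column type depends on the `sqltype.*` configuration):
+
+```
+items                      -- mapping table
+  ItemId   INT             -- sequential id, primary key
+  ItemName VARCHAR         -- openHAB item name
+
+item0001, item0002, ...    -- one data table per item
+  time     TIMESTAMP       -- primary key
+  value    <sqltype>       -- e.g. DOUBLE for a Number item
+```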
+
+### Number Precision
+
+Default openHAB number items are persisted with SQL datatype `double`.
+Internally openHAB uses `BigDecimal`.
+If better numerical precision is needed, set, for example, `sqltype.NUMBER = DECIMAL(max digits, max decimals)`; the service then works with `BigDecimal` on the Java side without type conversion.
+If a value has more decimals than `max decimals` allows, the persisted value is rounded mathematically correctly.
+The SQL types `DECIMAL` and `NUMERIC` are precise, but working with `DOUBLE` is faster.
+
+### Rounding results
+
+The results of database queries of number items are rounded to three decimal places by default.
+The number of decimals can be changed with `numberDecimalcount`.
+Especially if the SQL types `DECIMAL` or `NUMERIC` are used for `sqltype.NUMBER`, rounding can be disabled by setting `numberDecimalcount=-1`.
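+
+As a sketch, a `services/jdbc.cfg` fragment combining the two settings for higher-precision Number persistence (the digit counts are illustrative placeholders, not recommendations):
+
+```
+sqltype.NUMBER=DECIMAL(15,5)
+numberDecimalcount=-1
+```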
+
+### For Developers
+
+* Clearly separated source files for the database-specific part of openHAB logic.
+* Code duplication by similar services is prevented.
+* Integrating a new SQL and JDBC enabled database is fairly simple.
+
+### Performance Tests
+
+Not necessarily representative of the performance you may experience.
+
+| Database   | First Run (s) | Average (s) | Fastest (s) | Size After | Comment        |
+| ---------- | ------------: | ----------: | ----------: | ---------: | -------------- |
+| Derby      |         7.829 |       6.892 |       5.381 |    5.36 MB | local embedded |
+| H2         |         1.797 |       2.080 |       1.580 |    0.96 MB | local embedded |
+| HSQLDB     |         3.474 |       2.104 |       1.310 |    1.23 MB | local embedded |
+| MySQL      |        11.873 |      11.524 |      10.971 |          - | ext. Server VM |
+| PostgreSQL |         8.147 |       7.072 |       6.895 |          - | ext. Server VM |
+| SQLite     |         2.406 |       1.249 |       1.137 |    0.28 MB | local embedded |
+
+* Each test ran about 20 times, once every 30 seconds.
+* openHAB 1.x had already been started for about a minute.
+* The timing data in seconds used for the evaluation were taken from the console output.
+
+A script like the following was used:
+
+```
+var count = 0;
+rule "DB STRESS TEST"
+when
+ Time cron "30 * * * * ?"
+then
+    if (count == 24) count = 0
+ count = count+1
+ if( count > 3 && count < 23){
+ for( var i=500; i>1; i=i-1){
+ postUpdate( NUMBERITEM, i)
+ SWITCHITEM.previousState().state
+ postUpdate( DIMMERITEM, OFF)
+ NUMBERITEM.changedSince( now().minusMinutes(1))
+ postUpdate( DIMMERITEM, ON)
+ }
+ }
+end
+```
+
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.persistence.jdbc</artifactId>
+
+ <name>openHAB Add-ons :: Bundles :: Persistence Service :: JDBC</name>
+
+ <properties>
+ <bnd.importpackage>!org.osgi.service.jdbc.*,!sun.security.*,!org.apache.lucene.*,!org.apache.logging.log4j,!waffle.windows.auth.*,!org.hibernate.*,!org.jboss.*,!org.codehaus.groovy.*,!com.codahale.metrics.*,!com.google.protobuf.*,!com.ibm.icu.*,!com.ibm.jvm.*,!com.mchange.*,!com.sun.*,!com.vividsolutions.*,!io.prometheus.*,com.mysql.jdbc;resolution:=optional,org.apache.derby.*;resolution:=optional,org.h2;resolution:=optional,org.h2.jdbcx;resolution:=optional,org.hsqldb;resolution:=optional,org.hsqldb.jdbc;resolution:=optional,org.mariadb.jdbc;resolution:=optional,org.postgresql;resolution:=optional,org.sqlite;resolution:=optional,org.sqlite.jdbc4;resolution:=optional,javassist*;resolution:=optional</bnd.importpackage>
+ <dep.noembedding>derby,h2,hsqldb,mariadb-java-client,mysql-connector-java,postgresql,sqlite-jdbc</dep.noembedding>
+
+ <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+ <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
+ <hikari.version>2.4.7</hikari.version>
+ <dbutils.version>1.6</dbutils.version>
+ <yank.version>3.2.0</yank.version>
+
+ <!-- JDBC database driver versions -->
+ <derby.version>10.12.1.1</derby.version>
+ <h2.version>1.4.191</h2.version>
+ <hsqldb.version>2.3.3</hsqldb.version>
+ <mariadb.version>1.3.5</mariadb.version>
+ <mysql.version>8.0.13</mysql.version>
+ <postgresql.version>9.4.1212</postgresql.version>
+ <sqlite.version>3.16.1</sqlite.version>
+ </properties>
+
+ <dependencies>
+ <dependency>
+ <groupId>commons-dbutils</groupId>
+ <artifactId>commons-dbutils</artifactId>
+ <version>${dbutils.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.zaxxer</groupId>
+ <artifactId>HikariCP</artifactId>
+ <version>${hikari.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.knowm</groupId>
+ <artifactId>yank</artifactId>
+ <version>${yank.version}</version>
+ </dependency>
+
+ <!-- DB dependencies -->
+ <dependency>
+ <groupId>org.apache.derby</groupId>
+ <artifactId>derby</artifactId>
+ <version>${derby.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>com.h2database</groupId>
+ <artifactId>h2</artifactId>
+ <version>${h2.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.hsqldb</groupId>
+ <artifactId>hsqldb</artifactId>
+ <version>${hsqldb.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.mariadb.jdbc</groupId>
+ <artifactId>mariadb-java-client</artifactId>
+ <version>${mariadb.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>mysql</groupId>
+ <artifactId>mysql-connector-java</artifactId>
+ <version>${mysql.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.postgresql</groupId>
+ <artifactId>postgresql</artifactId>
+ <version>${postgresql.version}</version>
+ </dependency>
+ <dependency>
+ <groupId>org.xerial</groupId>
+ <artifactId>sqlite-jdbc</artifactId>
+ <version>${sqlite.version}</version>
+ </dependency>
+
+ </dependencies>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.persistence.jdbc-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+ <repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+ <!-- JDBC Persistence for: Apache Derby, H2, HSQLDB, MariaDB, MySQL, PostgreSQL, SQLite -->
+ <feature name="openhab-persistence-jdbc-derby" description="JDBC Persistence Apache Derby" version="${project.version}">
+ <configfile finalname="${openhab.conf}/services/jdbc.cfg" override="false">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc</configfile>
+ <feature prerequisite="false" dependency="false">openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.apache.derby/derbyclient/${derby.version}</bundle>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}</bundle>
+ </feature>
+
+ <feature name="openhab-persistence-jdbc-h2" description="JDBC Persistence H2" version="${project.version}">
+ <configfile finalname="${openhab.conf}/services/jdbc.cfg" override="false">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc</configfile>
+ <feature prerequisite="false" dependency="false">openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:com.h2database/h2/${h2.version}</bundle>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}</bundle>
+ </feature>
+
+ <feature name="openhab-persistence-jdbc-hsqldb" description="JDBC Persistence HSQLDB" version="${project.version}">
+ <configfile finalname="${openhab.conf}/services/jdbc.cfg" override="false">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc</configfile>
+ <feature prerequisite="false" dependency="false">openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.hsqldb/hsqldb/${hsqldb.version}</bundle>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}</bundle>
+ </feature>
+
+ <feature name="openhab-persistence-jdbc-mariadb" description="JDBC Persistence MariaDB" version="${project.version}">
+ <configfile finalname="${openhab.conf}/services/jdbc.cfg" override="false">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc</configfile>
+ <feature prerequisite="false" dependency="false">openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.mariadb.jdbc/mariadb-java-client/${mariadb.version}</bundle>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}</bundle>
+ </feature>
+
+ <feature name="openhab-persistence-jdbc-mysql" description="JDBC Persistence MySQL" version="${project.version}">
+ <configfile finalname="${openhab.conf}/services/jdbc.cfg" override="false">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc</configfile>
+ <feature prerequisite="false" dependency="false">openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:mysql/mysql-connector-java/${mysql.version}</bundle>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}</bundle>
+ </feature>
+
+ <feature name="openhab-persistence-jdbc-postgresql" description="JDBC Persistence PostgreSQL" version="${project.version}">
+ <configfile finalname="${openhab.conf}/services/jdbc.cfg" override="false">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc</configfile>
+ <feature prerequisite="false" dependency="false">openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.postgresql/postgresql/${postgresql.version}</bundle>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}</bundle>
+ </feature>
+
+ <feature name="openhab-persistence-jdbc-sqlite" description="JDBC Persistence SQLite" version="${project.version}">
+ <configfile finalname="${openhab.conf}/services/jdbc.cfg" override="false">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jdbc</configfile>
+ <feature prerequisite="false" dependency="false">openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.xerial/sqlite-jdbc/${sqlite.version}</bundle>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.jdbc/${project.version}</bundle>
+ </feature>
+
+</features>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import java.math.BigDecimal;
+import java.sql.Timestamp;
+import java.time.Instant;
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.time.format.DateTimeFormatter;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Properties;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.GroupItem;
+import org.openhab.core.items.Item;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.StringItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.model.JdbcHistoricItem;
+import org.openhab.persistence.jdbc.utils.DbMetaData;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Default Database Configuration class.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcBaseDAO {
+ private final Logger logger = LoggerFactory.getLogger(JdbcBaseDAO.class);
+
+ public Properties databaseProps = new Properties();
+ protected String urlSuffix = "";
+ public Map<String, String> sqlTypes = new HashMap<>();
+
+ // Get Database Meta data
+ protected DbMetaData dbMeta;
+
+ protected String sqlPingDB;
+ protected String sqlGetDB;
+ protected String sqlIfTableExists;
+ protected String sqlCreateNewEntryInItemsTable;
+ protected String sqlCreateItemsTableIfNot;
+ protected String sqlDeleteItemsEntry;
+ protected String sqlGetItemIDTableNames;
+ protected String sqlGetItemTables;
+ protected String sqlCreateItemTable;
+ protected String sqlInsertItemValue;
+
+ /********
+ * INIT *
+ ********/
+ public JdbcBaseDAO() {
+ initSqlTypes();
+ initDbProps();
+ initSqlQueries();
+ }
+
+ /**
+     * ## Get high precision by fractional seconds, examples ##
+ *
+ * mysql > 5.5 + mariadb > 5.2:
+ * DROP TABLE FractionalSeconds;
+ * CREATE TABLE FractionalSeconds (time TIMESTAMP(3), value TIMESTAMP(3));
+ * INSERT INTO FractionalSeconds (time, value) VALUES( NOW(3), '1999-01-09 20:11:11.126' );
+ * SELECT time FROM FractionalSeconds ORDER BY time DESC LIMIT 1;
+ *
+     * mysql <= 5.5 + mariadb <= 5.2: !!! NO high precision and fractional seconds !!!
+ * DROP TABLE FractionalSeconds;
+ * CREATE TABLE FractionalSeconds (time TIMESTAMP, value TIMESTAMP);
+ * INSERT INTO FractionalSeconds (time, value) VALUES( NOW(), '1999-01-09 20:11:11.126' );
+ * SELECT time FROM FractionalSeconds ORDER BY time DESC LIMIT 1;
+ *
+ * derby:
+ * DROP TABLE FractionalSeconds;
+ * CREATE TABLE FractionalSeconds (time TIMESTAMP, value TIMESTAMP);
+ * INSERT INTO FractionalSeconds (time, value) VALUES( CURRENT_TIMESTAMP, '1999-01-09 20:11:11.126' );
+ * SELECT time, value FROM FractionalSeconds;
+ *
+ * H2 + postgreSQL + hsqldb:
+ * DROP TABLE FractionalSeconds;
+ * CREATE TABLE FractionalSeconds (time TIMESTAMP, value TIMESTAMP);
+ * INSERT INTO FractionalSeconds (time, value) VALUES( NOW(), '1999-01-09 20:11:11.126' );
+ * SELECT time, value FROM FractionalSeconds;
+ *
+ * SQLite:
+ * DROP TABLE FractionalSeconds;
+ * CREATE TABLE FractionalSeconds (time TIMESTAMP, value TIMESTAMP);
+ * INSERT INTO FractionalSeconds (time, value) VALUES( strftime('%Y-%m-%d %H:%M:%f' , 'now' , 'localtime'),
+ * '1999-01-09 20:11:11.124' );
+ * SELECT time FROM FractionalSeconds ORDER BY time DESC LIMIT 1;
+ *
+ */
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ sqlPingDB = "SELECT 1";
+ sqlGetDB = "SELECT DATABASE()";
+ sqlIfTableExists = "SHOW TABLES LIKE '#searchTable#'";
+
+ sqlCreateNewEntryInItemsTable = "INSERT INTO #itemsManageTable# (ItemName) VALUES ('#itemname#')";
+ sqlCreateItemsTableIfNot = "CREATE TABLE IF NOT EXISTS #itemsManageTable# (ItemId INT NOT NULL AUTO_INCREMENT,#colname# #coltype# NOT NULL,PRIMARY KEY (ItemId))";
+ sqlDeleteItemsEntry = "DELETE FROM items WHERE ItemName=#itemname#";
+ sqlGetItemIDTableNames = "SELECT itemid, itemname FROM #itemsManageTable#";
+ sqlGetItemTables = "SELECT table_name FROM information_schema.tables WHERE table_type='BASE TABLE' AND table_schema='#jdbcUriDatabaseName#' AND NOT table_name='#itemsManageTable#'";
+ sqlCreateItemTable = "CREATE TABLE IF NOT EXISTS #tableName# (time #tablePrimaryKey# NOT NULL, value #dbType#, PRIMARY KEY(time))";
+ sqlInsertItemValue = "INSERT INTO #tableName# (TIME, VALUE) VALUES( #tablePrimaryValue#, ? ) ON DUPLICATE KEY UPDATE VALUE= ?";
+ }
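+
+ /*
+ * The '#placeholder#' tokens in the statements above are replaced at runtime
+ * via StringUtilsExt.replaceArrayMerge(sql, searchArray, replaceArray) before
+ * the SQL is executed. Illustrative sketch (the table name is an example only):
+ *
+ * String sql = StringUtilsExt.replaceArrayMerge("SHOW TABLES LIKE '#searchTable#'",
+ * new String[] { "#searchTable#" }, new String[] { "items" });
+ * // yields: SHOW TABLES LIKE 'items'
+ */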
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
+ logger.debug("JDBC::initSqlTypes: Initialize the type array");
+ sqlTypes.put("CALLITEM", "VARCHAR(200)");
+ sqlTypes.put("COLORITEM", "VARCHAR(70)");
+ sqlTypes.put("CONTACTITEM", "VARCHAR(6)");
+ sqlTypes.put("DATETIMEITEM", "TIMESTAMP");
+ sqlTypes.put("DIMMERITEM", "TINYINT");
+ sqlTypes.put("LOCATIONITEM", "VARCHAR(30)");
+ sqlTypes.put("NUMBERITEM", "DOUBLE");
+ sqlTypes.put("ROLLERSHUTTERITEM", "TINYINT");
+ sqlTypes.put("STRINGITEM", "VARCHAR(65500)");// jdbc max 21845
+ sqlTypes.put("SWITCHITEM", "VARCHAR(6)");
+ sqlTypes.put("tablePrimaryKey", "TIMESTAMP");
+ sqlTypes.put("tablePrimaryValue", "NOW()");
+ }
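+
+ // Note: the database-specific subclasses override individual entries, e.g.
+ // JdbcPostgresqlDAO maps NUMBERITEM to "DOUBLE PRECISION" and JdbcDerbyDAO
+ // maps DIMMERITEM to "SMALLINT" (see the DAO classes below).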
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ *
+ * driverClassName (used with jdbcUrl):
+ * Derby: org.apache.derby.jdbc.EmbeddedDriver
+ * H2: org.h2.Driver
+ * HSQLDB: org.hsqldb.jdbcDriver
+ * Jaybird: org.firebirdsql.jdbc.FBDriver
+ * MariaDB: org.mariadb.jdbc.Driver
+ * MySQL: com.mysql.jdbc.Driver
+ * MaxDB: com.sap.dbtech.jdbc.DriverSapDB
+ * PostgreSQL: org.postgresql.Driver
+ * Sybase: com.sybase.jdbc3.jdbc.SybDriver
+ * SQLite: org.sqlite.JDBC
+ *
+ * dataSourceClassName (for an alternative configuration):
+ * Derby: org.apache.derby.jdbc.ClientDataSource
+ * H2: org.h2.jdbcx.JdbcDataSource
+ * HSQLDB: org.hsqldb.jdbc.JDBCDataSource
+ * Jaybird: org.firebirdsql.pool.FBSimpleDataSource
+ * MariaDB, MySQL: org.mariadb.jdbc.MySQLDataSource
+ * MaxDB: com.sap.dbtech.jdbc.DriverSapDB
+ * PostgreSQL: org.postgresql.ds.PGSimpleDataSource
+ * Sybase: com.sybase.jdbc4.jdbc.SybDataSource
+ * SQLite: org.sqlite.SQLiteDataSource
+ *
+ * HikariPool - configuration Example:
+ * allowPoolSuspension.............false
+ * autoCommit......................true
+ * catalog.........................
+ * connectionInitSql...............
+ * connectionTestQuery.............
+ * connectionTimeout...............30000
+ * dataSource......................
+ * dataSourceClassName.............
+ * dataSourceJNDI..................
+ * dataSourceProperties............{password=<masked>}
+ * driverClassName.................
+ * healthCheckProperties...........{}
+ * healthCheckRegistry.............
+ * idleTimeout.....................600000
+ * initializationFailFast..........true
+ * isolateInternalQueries..........false
+ * jdbc4ConnectionTest.............false
+ * jdbcUrl.........................jdbc:mysql://192.168.0.1:3306/test
+ * leakDetectionThreshold..........0
+ * maxLifetime.....................1800000
+ * maximumPoolSize.................10
+ * metricRegistry..................
+ * metricsTrackerFactory...........
+ * minimumIdle.....................10
+ * password........................<masked>
+ * poolName........................HikariPool-0
+ * readOnly........................false
+ * registerMbeans..................false
+ * scheduledExecutorService........
+ * threadFactory...................
+ * transactionIsolation............
+ * username........................xxxx
+ * validationTimeout...............5000
+ */
+ private void initDbProps() {
+ // databaseProps.setProperty("dataSource.url", "jdbc:mysql://192.168.0.1:3306/test");
+ // databaseProps.setProperty("dataSource.user", "test");
+ // databaseProps.setProperty("dataSource.password", "test");
+
+ // Most relevant performance values, e.g.
+ // maximumPoolSize=20, minimumIdle=5, idleTimeout=2 minutes:
+ // databaseProps.setProperty("maximumPoolSize", "" + maximumPoolSize);
+ // databaseProps.setProperty("minimumIdle", "" + minimumIdle);
+ // databaseProps.setProperty("idleTimeout", "" + idleTimeout);
+ // databaseProps.setProperty("connectionTimeout", "" + connectionTimeout);
+ // databaseProps.setProperty("maxLifetime", "" + maxLifetime);
+ // databaseProps.setProperty("validationTimeout", "" + validationTimeout);
+ }
+
+ public void initAfterFirstDbConnection() {
+ logger.debug("JDBC::initAfterFirstDbConnection: Initializing step, after db is connected.");
+ // Initialize sqlTypes, depending on DB version for example
+ dbMeta = new DbMetaData();// get DB information
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ public Integer doPingDB() {
+ return Yank.queryScalar(sqlPingDB, Integer.class, null);
+ }
+
+ public String doGetDB() {
+ return Yank.queryScalar(sqlGetDB, String.class, null);
+ }
+
+ public boolean doIfTableExists(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlIfTableExists, new String[] { "#searchTable#" },
+ new String[] { vo.getItemsManageTable() });
+ logger.debug("JDBC::doIfTableExists sql={}", sql);
+ return Yank.queryScalar(sql, String.class, null) != null;
+ }
+
+ public Long doCreateNewEntryInItemsTable(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateNewEntryInItemsTable,
+ new String[] { "#itemsManageTable#", "#itemname#" },
+ new String[] { vo.getItemsManageTable(), vo.getItemname() });
+ logger.debug("JDBC::doCreateNewEntryInItemsTable sql={}", sql);
+ return Yank.insert(sql, null);
+ }
+
+ public ItemsVO doCreateItemsTableIfNot(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateItemsTableIfNot,
+ new String[] { "#itemsManageTable#", "#colname#", "#coltype#" },
+ new String[] { vo.getItemsManageTable(), vo.getColname(), vo.getColtype() });
+ logger.debug("JDBC::doCreateItemsTableIfNot sql={}", sql);
+ Yank.execute(sql, null);
+ return vo;
+ }
+
+ public void doDeleteItemsEntry(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlDeleteItemsEntry, new String[] { "#itemname#" },
+ new String[] { vo.getItemname() });
+ logger.debug("JDBC::doDeleteItemsEntry sql={}", sql);
+ Yank.execute(sql, null);
+ }
+
+ public List<ItemsVO> doGetItemIDTableNames(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlGetItemIDTableNames, new String[] { "#itemsManageTable#" },
+ new String[] { vo.getItemsManageTable() });
+ logger.debug("JDBC::doGetItemIDTableNames sql={}", sql);
+ return Yank.queryBeanList(sql, ItemsVO.class, null);
+ }
+
+ public List<ItemsVO> doGetItemTables(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlGetItemTables,
+ new String[] { "#jdbcUriDatabaseName#", "#itemsManageTable#" },
+ new String[] { vo.getJdbcUriDatabaseName(), vo.getItemsManageTable() });
+ logger.debug("JDBC::doGetItemTables sql={}", sql);
+ return Yank.queryBeanList(sql, ItemsVO.class, null);
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ public void doUpdateItemTableNames(List<ItemVO> vol) {
+ if (!vol.isEmpty()) {
+ String sql = updateItemTableNamesProvider(vol);
+ Yank.execute(sql, null);
+ }
+ }
+
+ public void doCreateItemTable(ItemVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateItemTable,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryKey#" },
+ new String[] { vo.getTableName(), vo.getDbType(), sqlTypes.get("tablePrimaryKey") });
+ logger.debug("JDBC::doCreateItemTable sql={}", sql);
+ Yank.execute(sql, null);
+ }
+
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(sqlInsertItemValue,
+ new String[] { "#tableName#", "#tablePrimaryValue#" },
+ new String[] { vo.getTableName(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue(), vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ public List<HistoricItem> doGetHistItemFilterQuery(Item item, FilterCriteria filter, int numberDecimalcount,
+ String table, String name) {
+ String sql = histItemFilterQueryProvider(filter, numberDecimalcount, table, name);
+ logger.debug("JDBC::doGetHistItemFilterQuery sql={}", sql);
+ List<Object[]> m = Yank.queryObjectArrays(sql, null);
+
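+ // Each result row is an Object[]: [0] = time column, [1] = value column.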
+ List<HistoricItem> items = new ArrayList<>();
+ for (int i = 0; i < m.size(); i++) {
+ items.add(new JdbcHistoricItem(item.getName(), getState(item, m.get(i)[1]), objectAsDate(m.get(i)[0])));
+ }
+ return items;
+ }
+
+ /*************
+ * Providers *
+ *************/
+ static final DateTimeFormatter JDBC_DATE_FORMAT = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
+
+ private String histItemFilterQueryProvider(FilterCriteria filter, int numberDecimalcount, String table,
+ String simpleName) {
+ logger.debug(
+ "JDBC::getHistItemFilterQueryProvider filter = {}, numberDecimalcount = {}, table = {}, simpleName = {}",
+ filter.toString(), numberDecimalcount, table, simpleName);
+
+ String filterString = "";
+ if (filter.getBeginDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME>'" + JDBC_DATE_FORMAT.format(filter.getBeginDate()) + "'";
+ }
+ if (filter.getEndDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME<'" + JDBC_DATE_FORMAT.format(filter.getEndDate()) + "'";
+ }
+ filterString += (filter.getOrdering() == Ordering.ASCENDING) ? " ORDER BY time ASC" : " ORDER BY time DESC";
+ if (filter.getPageSize() != 0x7fffffff) {
+ filterString += " LIMIT " + filter.getPageNumber() * filter.getPageSize() + "," + filter.getPageSize();
+ }
+ // SELECT time, ROUND(value,3) FROM number_item_0114 ORDER BY time DESC LIMIT 0,1
+ // rounding HALF UP
+ String queryString = "NUMBERITEM".equalsIgnoreCase(simpleName) && numberDecimalcount > -1
+ ? "SELECT time, ROUND(value," + numberDecimalcount + ") FROM " + table
+ : "SELECT time, value FROM " + table;
+ if (!filterString.isEmpty()) {
+ queryString += filterString;
+ }
+ logger.debug("JDBC::query queryString = {}", queryString);
+ return queryString;
+ }
+
+ private String updateItemTableNamesProvider(List<ItemVO> namesList) {
+ logger.debug("JDBC::updateItemTableNamesProvider namesList.size = {}", namesList.size());
+ String queryString = "";
+ for (int i = 0; i < namesList.size(); i++) {
+ ItemVO it = namesList.get(i);
+ queryString += "ALTER TABLE " + it.getTableName() + " RENAME TO " + it.getNewTableName() + ";";
+ }
+ logger.debug("JDBC::query queryString = {}", queryString);
+ return queryString;
+ }
+
+ protected ItemVO storeItemValueProvider(Item item, ItemVO vo) {
+ String itemType = getItemType(item);
+
+ logger.debug("JDBC::storeItemValueProvider: item '{}' as Type '{}' in '{}' with state '{}'", item.getName(),
+ itemType, vo.getTableName(), item.getState().toString());
+
+ // insertItemValue
+ logger.debug("JDBC::storeItemValueProvider: getState: '{}'", item.getState().toString());
+ if ("COLORITEM".equals(itemType)) {
+ vo.setValueTypes(getSqlTypes().get(itemType), java.lang.String.class);
+ vo.setValue(item.getState().toString());
+ } else if ("NUMBERITEM".equals(itemType)) {
+ String it = getSqlTypes().get(itemType);
+ if (it.toUpperCase().contains("DOUBLE")) {
+ vo.setValueTypes(it, java.lang.Double.class);
+ Number newVal = ((DecimalType) item.getState());
+ logger.debug("JDBC::storeItemValueProvider: newVal.doubleValue: '{}'", newVal.doubleValue());
+ vo.setValue(newVal.doubleValue());
+ } else if (it.toUpperCase().contains("DECIMAL") || it.toUpperCase().contains("NUMERIC")) {
+ vo.setValueTypes(it, java.math.BigDecimal.class);
+ DecimalType newVal = ((DecimalType) item.getState());
+ logger.debug("JDBC::storeItemValueProvider: newVal.toBigDecimal: '{}'", newVal.toBigDecimal());
+ vo.setValue(newVal.toBigDecimal());
+ } else if (it.toUpperCase().contains("INT")) {
+ vo.setValueTypes(it, java.lang.Integer.class);
+ Number newVal = ((DecimalType) item.getState());
+ logger.debug("JDBC::storeItemValueProvider: newVal.intValue: '{}'", newVal.intValue());
+ vo.setValue(newVal.intValue());
+ } else {// fall back to String
+ vo.setValueTypes(it, java.lang.String.class);
+ logger.warn("JDBC::storeItemValueProvider: item.getState().toString(): '{}'",
+ item.getState().toString());
+ vo.setValue(item.getState().toString());
+ }
+ } else if ("ROLLERSHUTTERITEM".equals(itemType) || "DIMMERITEM".equals(itemType)) {
+ vo.setValueTypes(getSqlTypes().get(itemType), java.lang.Integer.class);
+ Number newVal = ((DecimalType) item.getState());
+ logger.debug("JDBC::storeItemValueProvider: newVal.intValue: '{}'", newVal.intValue());
+ vo.setValue(newVal.intValue());
+ } else if ("DATETIMEITEM".equals(itemType)) {
+ vo.setValueTypes(getSqlTypes().get(itemType), java.sql.Timestamp.class);
+ java.sql.Timestamp d = new java.sql.Timestamp(
+ ((DateTimeType) item.getState()).getZonedDateTime().toInstant().toEpochMilli());
+ logger.debug("JDBC::storeItemValueProvider: DateTimeItem: '{}'", d);
+ vo.setValue(d);
+ } else {
+ /*
+ * !!ATTENTION!!
+ *
+ * 1. DimmerItem.getStateAs(PercentType.class).toString() always returns 0,
+ * whereas RollershutterItem.getStateAs(PercentType.class).toString() works
+ * as expected.
+ *
+ * 2. Since ColorItem extends DimmerItem, (item instanceof DimmerItem) is also
+ * true for a ColorItem. Therefore instanceof tests always have to check
+ * ColorItem before DimmerItem.
+ *
+ * !!ATTENTION!!
+ */
+ // All other items should return the best format by default
+ vo.setValueTypes(getSqlTypes().get(itemType), java.lang.String.class);
+ logger.debug("JDBC::storeItemValueProvider: other: item.getState().toString(): '{}'",
+ item.getState().toString());
+ vo.setValue(item.getState().toString());
+ }
+ return vo;
+ }
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+ protected State getState(Item item, Object v) {
+ String clazz = v.getClass().getSimpleName();
+ logger.debug("JDBC::ItemResultHandler::handleResult getState value = '{}', getClass = '{}', clazz = '{}'",
+ v.toString(), v.getClass(), clazz);
+ if (item instanceof NumberItem) {
+ String it = getSqlTypes().get("NUMBERITEM");
+ if (it.toUpperCase().contains("DOUBLE")) {
+ return new DecimalType(((Number) v).doubleValue());
+ } else if (it.toUpperCase().contains("DECIMAL") || it.toUpperCase().contains("NUMERIC")) {
+ return new DecimalType((BigDecimal) v);
+ } else if (it.toUpperCase().contains("INT")) {
+ return new DecimalType(((Integer) v).intValue());
+ }
+ return DecimalType.valueOf((String) v);
+ } else if (item instanceof ColorItem) {
+ return HSBType.valueOf((String) v);
+ } else if (item instanceof DimmerItem) {
+ return new PercentType(objectAsInteger(v));
+ } else if (item instanceof SwitchItem) {
+ return OnOffType.valueOf(((String) v).trim());
+ } else if (item instanceof ContactItem) {
+ return OpenClosedType.valueOf(((String) v).trim());
+ } else if (item instanceof RollershutterItem) {
+ return new PercentType(objectAsInteger(v));
+ } else if (item instanceof DateTimeItem) {
+ return new DateTimeType(
+ ZonedDateTime.ofInstant(Instant.ofEpochMilli(objectAsLong(v)), ZoneId.systemDefault()));
+ } else if (item instanceof StringItem) {
+ return StringType.valueOf((String) v);
+ } else { // Call, Location, and all remaining item types are stored as strings
+ return StringType.valueOf((String) v);
+ }
+ }
+
+ protected ZonedDateTime objectAsDate(Object v) {
+ if (v instanceof java.lang.String) {
+ return ZonedDateTime.ofInstant(Timestamp.valueOf(v.toString()).toInstant(), ZoneId.systemDefault());
+ }
+ return ZonedDateTime.ofInstant(((Timestamp) v).toInstant(), ZoneId.systemDefault());
+ }
+
+ protected Long objectAsLong(Object v) {
+ if (v instanceof Long) {
+ return ((Number) v).longValue();
+ } else if (v instanceof java.sql.Date) {
+ return ((java.sql.Date) v).getTime();
+ }
+ return ((java.sql.Timestamp) v).getTime();
+ }
+
+ protected Integer objectAsInteger(Object v) {
+ if (v instanceof Byte) {
+ return ((Byte) v).intValue();
+ }
+ return ((Integer) v).intValue();
+ }
+
+ public String getItemType(Item i) {
+ Item item = i;
+ String def = "STRINGITEM";
+ if (i instanceof GroupItem) {
+ item = ((GroupItem) i).getBaseItem();
+ if (item == null) {
+ // The GroupItem's base type (GroupItem:<ItemType>) is not defined in the
+ // *.items file; fall back to the first member and try to use its type.
+ logger.debug(
+ "JDBC::getItemType: Cannot detect ItemType for {} because the GroupItem's base type isn't set in the *.items file.",
+ i.getName());
+ item = ((GroupItem) i).getMembers().stream().findFirst().orElse(null);
+ if (item == null) {
+ logger.debug(
+ "JDBC::getItemType: No ItemType found for the first member of GroupItem {}, falling back to STRINGITEM",
+ i.getName());
+ return def;
+ }
+ }
+ }
+ String itemType = item.getClass().getSimpleName().toUpperCase();
+ logger.debug("JDBC::getItemType: Try to use ItemType {} for Item {}", itemType, i.getName());
+ if (sqlTypes.get(itemType) == null) {
+ logger.warn(
+ "JDBC::getItemType: No sqlType found for ItemType {}, falling back to STRINGITEM for Item {}",
+ itemType, i.getName());
+ return def;
+ }
+ return itemType;
+ }
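+
+ // Example: a SwitchItem yields the key "SWITCHITEM", which initSqlTypes()
+ // maps to the column type VARCHAR(6).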
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+ public Map<String, String> getSqlTypes() {
+ return sqlTypes;
+ }
+
+ public String getDataType(Item item) {
+ return sqlTypes.get(getItemType(item));
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import java.time.format.DateTimeFormatter;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.model.JdbcHistoricItem;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class. It represents the
+ * database-specific configuration and overrides or supplements the
+ * default settings from JdbcBaseDAO. Enter only the differences to JdbcBaseDAO here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcDerbyDAO extends JdbcBaseDAO {
+ private final Logger logger = LoggerFactory.getLogger(JdbcDerbyDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcDerbyDAO() {
+ super();
+ initSqlTypes();
+ initDbProps();
+ initSqlQueries();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ sqlPingDB = "values 1";
+ sqlGetDB = "VALUES SYSCS_UTIL.SYSCS_GET_DATABASE_PROPERTY( 'DataDictionaryVersion' )"; // returns version
+ sqlIfTableExists = "SELECT * FROM SYS.SYSTABLES WHERE TABLENAME='#searchTable#'";
+ sqlCreateItemsTableIfNot = "CREATE TABLE #itemsManageTable# ( ItemId INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY (START WITH 1, INCREMENT BY 1), #colname# #coltype# NOT NULL)";
+ sqlCreateItemTable = "CREATE TABLE #tableName# (time #tablePrimaryKey# NOT NULL, value #dbType#, PRIMARY KEY(time))";
+ // Avoid errors caused by duplicate time values (rare); Derby offers no suitable MERGE here:
+ // http://www.codeproject.com/Questions/162627/how-to-insert-new-record-in-my-table-if-not-exists
+ sqlInsertItemValue = "INSERT INTO #tableName# (TIME, VALUE) VALUES( #tablePrimaryValue#, CAST( ? as #dbType#) )";
+ }
+
+ private void initSqlTypes() {
+ sqlTypes.put("DATETIMEITEM", "TIMESTAMP");
+ sqlTypes.put("DIMMERITEM", "SMALLINT");
+ sqlTypes.put("ROLLERSHUTTERITEM", "SMALLINT");
+ sqlTypes.put("STRINGITEM", "VARCHAR(32000)");
+ sqlTypes.put("tablePrimaryValue", "CURRENT_TIMESTAMP");
+ logger.debug("JDBC::initSqlTypes: Initialized the type array sqlTypes={}", sqlTypes.values());
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+ private void initDbProps() {
+ // Properties for HikariCP
+ // Use driverClassName
+ databaseProps.setProperty("driverClassName", "org.apache.derby.jdbc.EmbeddedDriver");
+ // OR dataSourceClassName
+ // databaseProps.setProperty("dataSourceClassName", "org.apache.derby.jdbc.EmbeddedDataSource");
+ databaseProps.setProperty("maximumPoolSize", "1");
+ databaseProps.setProperty("minimumIdle", "1");
+ }
+
+ @Override
+ public void initAfterFirstDbConnection() {
+ logger.debug("JDBC::initAfterFirstDbConnection: Initializing step, after db is connected.");
+ // Initialize sqlTypes, depending on DB version for example
+ // derby does not like this... dbMeta = new DbMetaData();// get DB information
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ @Override
+ public Integer doPingDB() {
+ return Yank.queryScalar(sqlPingDB, Integer.class, null);
+ }
+
+ @Override
+ public boolean doIfTableExists(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlIfTableExists, new String[] { "#searchTable#" },
+ new String[] { vo.getItemsManageTable().toUpperCase() });
+ logger.debug("JDBC::doIfTableExists sql={}", sql);
+ return Yank.queryScalar(sql, String.class, null) != null;
+ }
+
+ @Override
+ public Long doCreateNewEntryInItemsTable(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateNewEntryInItemsTable,
+ new String[] { "#itemsManageTable#", "#itemname#" },
+ new String[] { vo.getItemsManageTable().toUpperCase(), vo.getItemname() });
+ logger.debug("JDBC::doCreateNewEntryInItemsTable sql={}", sql);
+ return Yank.insert(sql, null);
+ }
+
+ @Override
+ public ItemsVO doCreateItemsTableIfNot(ItemsVO vo) {
+ // boolean tableExists = Yank.queryScalar(SQL_IF_TABLE_EXISTS.replace("#searchTable#",
+ // vo.getItemsManageTable().toUpperCase()), String.class, null) == null;
+ boolean tableExists = doIfTableExists(vo);
+ if (!tableExists) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateItemsTableIfNot,
+ new String[] { "#itemsManageTable#", "#colname#", "#coltype#" },
+ new String[] { vo.getItemsManageTable().toUpperCase(), vo.getColname(), vo.getColtype() });
+ logger.debug("JDBC::doCreateItemsTableIfNot tableExists={} therefore sql={}", tableExists, sql);
+ Yank.execute(sql, null);
+ } else {
+ logger.debug("JDBC::doCreateItemsTableIfNot tableExists={}, did not CREATE TABLE", tableExists);
+ }
+ return vo;
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ @Override
+ public void doCreateItemTable(ItemVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateItemTable,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryKey#" },
+ new String[] { vo.getTableName(), vo.getDbType(), sqlTypes.get("tablePrimaryKey") });
+ Yank.execute(sql, null);
+ }
+
+ @Override
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(sqlInsertItemValue,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryValue#" },
+ new String[] { vo.getTableName().toUpperCase(), vo.getDbType(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ @Override
+ public List<HistoricItem> doGetHistItemFilterQuery(Item item, FilterCriteria filter, int numberDecimalcount,
+ String table, String name) {
+ String sql = histItemFilterQueryProvider(filter, numberDecimalcount, table, name);
+ List<Object[]> m = Yank.queryObjectArrays(sql, null);
+
+ logger.debug("JDBC::doGetHistItemFilterQuery got Array length={}", m.size());
+
+ List<HistoricItem> items = new ArrayList<>();
+ for (int i = 0; i < m.size(); i++) {
+ logger.debug("JDBC::doGetHistItemFilterQuery 0='{}' 1='{}'", m.get(i)[0], m.get(i)[1]);
+ items.add(new JdbcHistoricItem(item.getName(), getState(item, m.get(i)[1]), objectAsDate(m.get(i)[0])));
+ }
+ return items;
+ }
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+ static final DateTimeFormatter JDBC_DATE_FORMAT = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
+
+ /**
+ * @param filter the filter criteria to apply
+ * @param numberDecimalcount number of decimals used for rounding NUMBERITEM values; -1 disables rounding
+ * @param table name of the table to query
+ * @param simpleName simple name of the item type, e.g. "NUMBERITEM"
+ * @return the assembled SQL query string
+ */
+ private String histItemFilterQueryProvider(FilterCriteria filter, int numberDecimalcount, String table,
+ String simpleName) {
+ logger.debug(
+ "JDBC::getHistItemFilterQueryProvider filter = {}, numberDecimalcount = {}, table = {}, simpleName = {}",
+ StringUtilsExt.filterToString(filter), numberDecimalcount, table, simpleName);
+
+ String filterString = "";
+ if (filter.getBeginDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME>'" + JDBC_DATE_FORMAT.format(filter.getBeginDate()) + "'";
+ }
+ if (filter.getEndDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME<'" + JDBC_DATE_FORMAT.format(filter.getEndDate()) + "'";
+ }
+ filterString += (filter.getOrdering() == Ordering.ASCENDING) ? " ORDER BY time ASC" : " ORDER BY time DESC";
+ if (filter.getPageSize() != 0x7fffffff) {
+ // Derby pagination uses OFFSET ... ROWS FETCH NEXT ... ROWS ONLY,
+ // equivalent to the "LIMIT offset, pageSize" used for MySQL in JdbcBaseDAO
+ filterString += " OFFSET " + filter.getPageNumber() * filter.getPageSize() + " ROWS FETCH NEXT "
+ + filter.getPageSize() + " ROWS ONLY";
+ }
+
+ // Simulated ROUND function for Derby (see link): rounding HALF UP by adding
+ // 0.0...05 and letting CAST truncate, e.g. CAST(value + 0.0005 AS DECIMAL(31,3)):
+ // http://www.seemoredata.com/en/showthread.php?132-Round-function-in-Apache-Derby
+
+ String queryString = "SELECT time,";
+ if ("NUMBERITEM".equalsIgnoreCase(simpleName) && numberDecimalcount > -1) {
+ // rounding HALF UP
+ queryString += "CAST(value + 0.";
+ for (int i = 0; i < numberDecimalcount; i++) {
+ queryString += "0";
+ }
+ queryString += "5 AS DECIMAL(31," + numberDecimalcount + "))"; // 31 is Derby's DECIMAL max precision
+ // https://db.apache.org/derby/docs/10.0/manuals/develop/develop151.html
+ queryString += " FROM " + table.toUpperCase();
+ } else {
+ queryString += " value FROM " + table.toUpperCase();
+ }
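+ // e.g. (illustrative): SELECT time,CAST(value + 0.0005 AS DECIMAL(31,3)) FROM NUMBER_ITEM_0114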
+
+ if (!filterString.isEmpty()) {
+ queryString += filterString;
+ }
+ logger.debug("JDBC::query queryString = {}", queryString);
+ return queryString;
+ }
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class. It represents the
+ * database-specific configuration and overrides or supplements the
+ * default settings from JdbcBaseDAO. Enter only the differences to JdbcBaseDAO here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcH2DAO extends JdbcBaseDAO {
+ private final Logger logger = LoggerFactory.getLogger(JdbcH2DAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcH2DAO() {
+ super();
+ initSqlQueries();
+ initSqlTypes();
+ initDbProps();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ sqlIfTableExists = "SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME='#searchTable#'";
+ // SQL_INSERT_ITEM_VALUE = "INSERT INTO #tableName# (TIME, VALUE) VALUES( NOW(), CAST( ? as #dbType#) )";
+ // http://stackoverflow.com/questions/19768051/h2-sql-database-insert-if-the-record-does-not-exist
+ sqlInsertItemValue = "MERGE INTO #tableName# (TIME, VALUE) VALUES( #tablePrimaryValue#, CAST( ? as #dbType#) )";
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
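+ // No type overrides needed; the defaults from JdbcBaseDAO apply to H2.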
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+ private void initDbProps() {
+ // Properties for HikariCP
+ databaseProps.setProperty("driverClassName", "org.h2.Driver");
+ // Alternatively (and preferably) use dataSourceClassName:
+ // databaseProps.setProperty("dataSourceClassName", "org.h2.jdbcx.JdbcDataSource");
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ @Override
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(sqlInsertItemValue,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryValue#" },
+ new String[] { vo.getTableName(), vo.getDbType(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class. It represents the
+ * database-specific configuration and overrides or supplements the
+ * default settings from JdbcBaseDAO. Enter only the differences to JdbcBaseDAO here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcHsqldbDAO extends JdbcBaseDAO {
+ private final Logger logger = LoggerFactory.getLogger(JdbcHsqldbDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcHsqldbDAO() {
+ super();
+ initSqlQueries();
+ initSqlTypes();
+ initDbProps();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ // http://hsqldb.org/doc/guide/builtinfunctions-chapt.html
+ sqlPingDB = "SELECT 1 FROM INFORMATION_SCHEMA.SYSTEM_USERS";
+ sqlGetDB = "SELECT DATABASE () FROM INFORMATION_SCHEMA.SYSTEM_USERS";
+ sqlIfTableExists = "SELECT * FROM INFORMATION_SCHEMA.SYSTEM_TABLES WHERE TABLE_NAME='#searchTable#'";
+ sqlCreateItemsTableIfNot = "CREATE TABLE IF NOT EXISTS #itemsManageTable# ( ItemId INT GENERATED BY DEFAULT AS IDENTITY (START WITH 1, INCREMENT BY 1) NOT NULL, #colname# #coltype# NOT NULL)";
+ sqlCreateNewEntryInItemsTable = "INSERT INTO #itemsManageTable# (ItemName) VALUES ('#itemname#')";
+ // Avoid errors caused by duplicate time values:
+ // http://hsqldb.org/doc/guide/dataaccess-chapt.html#dac_merge_statement
+ // SQL_INSERT_ITEM_VALUE = "INSERT INTO #tableName# (TIME, VALUE) VALUES( NOW(), CAST( ? as #dbType#) )";
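+ // The MERGE below inserts the row only when no row with the same TIME exists,
+ // i.e. it behaves like an upsert and silently ignores duplicates.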
+ sqlInsertItemValue = "MERGE INTO #tableName# "
+ + "USING (VALUES #tablePrimaryValue#, CAST( ? as #dbType#)) temp (TIME, VALUE) ON (#tableName#.TIME=temp.TIME) "
+ + "WHEN NOT MATCHED THEN INSERT (TIME, VALUE) VALUES (temp.TIME, temp.VALUE)";
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
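+ // No type overrides needed; the defaults from JdbcBaseDAO apply to HSQLDB.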
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+ private void initDbProps() {
+ // Properties for HikariCP
+ databaseProps.setProperty("driverClassName", "org.hsqldb.jdbcDriver");
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ @Override
+ public Integer doPingDB() {
+ return Yank.queryScalar(sqlPingDB, Integer.class, null);
+ }
+
+ @Override
+ public ItemsVO doCreateItemsTableIfNot(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateItemsTableIfNot,
+ new String[] { "#itemsManageTable#", "#colname#", "#coltype#", "#itemsManageTable#" },
+ new String[] { vo.getItemsManageTable(), vo.getColname(), vo.getColtype(), vo.getItemsManageTable() });
+ logger.debug("JDBC::doCreateItemsTableIfNot sql={}", sql);
+ Yank.execute(sql, null);
+ return vo;
+ }
+
+ @Override
+ public Long doCreateNewEntryInItemsTable(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateNewEntryInItemsTable,
+ new String[] { "#itemsManageTable#", "#itemname#" },
+ new String[] { vo.getItemsManageTable(), vo.getItemname() });
+ logger.debug("JDBC::doCreateNewEntryInItemsTable sql={}", sql);
+ return Yank.insert(sql, null);
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ @Override
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(sqlInsertItemValue,
+ new String[] { "#tableName#", "#dbType#", "#tableName#", "#tablePrimaryValue#" }, new String[] {
+ vo.getTableName(), vo.getDbType(), vo.getTableName(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import org.knowm.yank.Yank;
+import org.openhab.persistence.jdbc.utils.DbMetaData;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class. It represents the
+ * database-specific configuration and overrides or supplements the
+ * default settings from JdbcBaseDAO. Enter only the differences to JdbcBaseDAO here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcMariadbDAO extends JdbcBaseDAO {
+ private final Logger logger = LoggerFactory.getLogger(JdbcMariadbDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcMariadbDAO() {
+ super();
+ initSqlTypes();
+ initDbProps();
+ initSqlQueries();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
+ logger.debug("JDBC::initSqlTypes: Initialize the type array");
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+ private void initDbProps() {
+ // Performance tuning
+ databaseProps.setProperty("dataSource.cachePrepStmts", "true");
+ databaseProps.setProperty("dataSource.prepStmtCacheSize", "250");
+ databaseProps.setProperty("dataSource.prepStmtCacheSqlLimit", "2048");
+ databaseProps.setProperty("dataSource.jdbcCompliantTruncation", "false"); // the JDBC standard max VARCHAR length is 21845
+
+ // Properties for HikariCP
+ // Use driverClassName
+ databaseProps.setProperty("driverClassName", "org.mariadb.jdbc.Driver");
+ // Alternatively (and preferably) use dataSourceClassName:
+ // databaseProps.setProperty("dataSourceClassName", "org.mariadb.jdbc.MySQLDataSource");
+ databaseProps.setProperty("maximumPoolSize", "3");
+ databaseProps.setProperty("minimumIdle", "2");
+ }
+
+ @Override
+ public void initAfterFirstDbConnection() {
+ logger.debug("JDBC::initAfterFirstDbConnection: Initializing step, after db is connected.");
+ dbMeta = new DbMetaData();
+ // Initialize sqlTypes, depending on DB version for example
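+ // Newer MariaDB versions support fractional seconds (see the fractional-seconds
+ // examples in JdbcBaseDAO), enabling millisecond-precision timestamps.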
+ if (dbMeta.isDbVersionGreater(5, 1)) {
+ sqlTypes.put("DATETIMEITEM", "TIMESTAMP(3)");
+ sqlTypes.put("tablePrimaryKey", "TIMESTAMP(3)");
+ sqlTypes.put("tablePrimaryValue", "NOW(3)");
+ }
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ @Override
+ public Integer doPingDB() {
+ return Yank.queryScalar(sqlPingDB, Long.class, null).intValue();
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import org.knowm.yank.Yank;
+import org.openhab.persistence.jdbc.utils.DbMetaData;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class. It represents the
+ * database-specific configuration and overrides or supplements the
+ * default settings from JdbcBaseDAO. Enter only the differences to JdbcBaseDAO here.
+ *
+ * Since driver version >= 6.0 a timezone conversion is sometimes needed via ?serverTimezone=UTC.
+ * Example (MySQL 5.7):
+ * dbProps.setProperty("jdbcUrl", "jdbc:mysql://192.168.0.181:3306/ItemTypeTest3?serverTimezone=UTC");
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcMysqlDAO extends JdbcBaseDAO {
+ private final Logger logger = LoggerFactory.getLogger(JdbcMysqlDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcMysqlDAO() {
+ super();
+ initSqlTypes();
+ initDbProps();
+ initSqlQueries();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
+ logger.debug("JDBC::initSqlTypes: Initialize the type array");
+ sqlTypes.put("STRINGITEM", "VARCHAR(21717)");// mysql using utf-8 max 65535/3 = 21845, using 21845-128 = 21717
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+ private void initDbProps() {
+ // Performance tuning
+ databaseProps.setProperty("dataSource.cachePrepStmts", "true");
+ databaseProps.setProperty("dataSource.prepStmtCacheSize", "250");
+ databaseProps.setProperty("dataSource.prepStmtCacheSqlLimit", "2048");
+ databaseProps.setProperty("dataSource.jdbcCompliantTruncation", "false"); // the JDBC standard max VARCHAR length is 21845
+
+ // Properties for HikariCP
+ // Use driverClassName
+ databaseProps.setProperty("driverClassName", "com.mysql.jdbc.Driver");
+ // OR dataSourceClassName
+ // databaseProps.setProperty("dataSourceClassName", "com.mysql.jdbc.jdbc2.optional.MysqlDataSource");
+ databaseProps.setProperty("maximumPoolSize", "3");
+ databaseProps.setProperty("minimumIdle", "2");
+ }
+
+ @Override
+ public void initAfterFirstDbConnection() {
+ logger.debug("JDBC::initAfterFirstDbConnection: Initializing step, after db is connected.");
+ dbMeta = new DbMetaData();
+ // Initialize sqlTypes, depending on DB version for example
+ if (dbMeta.isDbVersionGreater(5, 5)) {
+ sqlTypes.put("DATETIMEITEM", "TIMESTAMP(3)");
+ sqlTypes.put("tablePrimaryKey", "TIMESTAMP(3)");
+ sqlTypes.put("tablePrimaryValue", "NOW(3)");
+ }
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ @Override
+ public Integer doPingDB() {
+ return Yank.queryScalar(sqlPingDB, Long.class, null).intValue();
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import java.time.format.DateTimeFormatter;
+import java.util.ArrayList;
+import java.util.List;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.model.JdbcHistoricItem;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class. It represents the
+ * database-specific configuration and overrides or supplements the
+ * default settings from JdbcBaseDAO. Enter only the differences to JdbcBaseDAO here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcPostgresqlDAO extends JdbcBaseDAO {
+ private final Logger logger = LoggerFactory.getLogger(JdbcPostgresqlDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcPostgresqlDAO() {
+ super();
+ initSqlQueries();
+ initSqlTypes();
+ initDbProps();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ // System Information Functions: https://www.postgresql.org/docs/9.2/static/functions-info.html
+ sqlGetDB = "SELECT CURRENT_DATABASE()";
+ sqlIfTableExists = "SELECT * FROM PG_TABLES WHERE TABLENAME='#searchTable#'";
+ sqlCreateItemsTableIfNot = "CREATE TABLE IF NOT EXISTS #itemsManageTable# (itemid SERIAL NOT NULL, #colname# #coltype# NOT NULL, CONSTRAINT #itemsManageTable#_pkey PRIMARY KEY (itemid))";
+ sqlCreateNewEntryInItemsTable = "INSERT INTO items (itemname) SELECT itemname FROM #itemsManageTable# UNION VALUES ('#itemname#') EXCEPT SELECT itemname FROM items";
+ sqlGetItemTables = "SELECT table_name FROM information_schema.tables WHERE table_type='BASE TABLE' AND table_schema='public' AND NOT table_name='#itemsManageTable#'";
+ // http://stackoverflow.com/questions/17267417/how-do-i-do-an-upsert-merge-insert-on-duplicate-update-in-postgresql
+ // For later use: with PostgreSQL >= 9.5, a PRIMARY KEY violation can be prevented with ON CONFLICT:
+ // SQL_INSERT_ITEM_VALUE = "INSERT INTO #tableName# (TIME, VALUE) VALUES( NOW(), CAST( ? as #dbType#) ) ON
+ // CONFLICT DO NOTHING";
+ sqlInsertItemValue = "INSERT INTO #tableName# (TIME, VALUE) VALUES( #tablePrimaryValue#, CAST( ? as #dbType#) )";
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
+ // Initialize the type array
+ sqlTypes.put("CALLITEM", "VARCHAR");
+ sqlTypes.put("COLORITEM", "VARCHAR");
+ sqlTypes.put("CONTACTITEM", "VARCHAR");
+ sqlTypes.put("DATETIMEITEM", "TIMESTAMP");
+ sqlTypes.put("DIMMERITEM", "SMALLINT");
+ sqlTypes.put("LOCATIONITEM", "VARCHAR");
+ sqlTypes.put("NUMBERITEM", "DOUBLE PRECISION");
+ sqlTypes.put("ROLLERSHUTTERITEM", "SMALLINT");
+ sqlTypes.put("STRINGITEM", "VARCHAR");
+ sqlTypes.put("SWITCHITEM", "VARCHAR");
+ logger.debug("JDBC::initSqlTypes: Initialized the type array sqlTypes={}", sqlTypes.values());
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+ private void initDbProps() {
+ // Performance:
+ // databaseProps.setProperty("dataSource.cachePrepStmts", "true");
+ // databaseProps.setProperty("dataSource.prepStmtCacheSize", "250");
+ // databaseProps.setProperty("dataSource.prepStmtCacheSqlLimit", "2048");
+
+ // Properties for HikariCP
+ databaseProps.setProperty("driverClassName", "org.postgresql.Driver");
+ // Alternatively (and preferably) use dataSourceClassName:
+ // databaseProps.setProperty("dataSourceClassName", "org.postgresql.ds.PGSimpleDataSource");
+ // databaseProps.setProperty("maximumPoolSize", "3");
+ // databaseProps.setProperty("minimumIdle", "2");
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+ @Override
+ public ItemsVO doCreateItemsTableIfNot(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateItemsTableIfNot,
+ new String[] { "#itemsManageTable#", "#colname#", "#coltype#", "#itemsManageTable#" },
+ new String[] { vo.getItemsManageTable(), vo.getColname(), vo.getColtype(), vo.getItemsManageTable() });
+ logger.debug("JDBC::doCreateItemsTableIfNot sql={}", sql);
+ Yank.execute(sql, null);
+ return vo;
+ }
+
+ @Override
+ public Long doCreateNewEntryInItemsTable(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateNewEntryInItemsTable,
+ new String[] { "#itemsManageTable#", "#itemname#" },
+ new String[] { vo.getItemsManageTable(), vo.getItemname() });
+ logger.debug("JDBC::doCreateNewEntryInItemsTable sql={}", sql);
+ return Yank.insert(sql, null);
+ }
+
+ @Override
+ public List<ItemsVO> doGetItemTables(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlGetItemTables, new String[] { "#itemsManageTable#" },
+ new String[] { vo.getItemsManageTable() });
+ logger.debug("JDBC::doGetItemTables sql={}", sql);
+ return Yank.queryBeanList(sql, ItemsVO.class, null);
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ @Override
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(sqlInsertItemValue,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryValue#" },
+ new String[] { vo.getTableName(), vo.getDbType(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ @Override
+ public List<HistoricItem> doGetHistItemFilterQuery(Item item, FilterCriteria filter, int numberDecimalcount,
+ String table, String name) {
+ String sql = histItemFilterQueryProvider(filter, numberDecimalcount, table, name);
+ logger.debug("JDBC::doGetHistItemFilterQuery sql={}", sql);
+ List<Object[]> m = Yank.queryObjectArrays(sql, null);
+
+ List<HistoricItem> items = new ArrayList<>();
+ for (int i = 0; i < m.size(); i++) {
+ items.add(new JdbcHistoricItem(item.getName(), getState(item, m.get(i)[1]), objectAsDate(m.get(i)[0])));
+ }
+ return items;
+ }
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+ static final DateTimeFormatter JDBC_DATE_FORMAT = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
+
+ /**
+ * @param filter the filter criteria to apply
+ * @param numberDecimalcount number of decimals used for rounding NUMBERITEM values; -1 disables rounding
+ * @param table name of the table to query
+ * @param simpleName simple name of the item type, e.g. "NUMBERITEM"
+ * @return the assembled SQL query string
+ */
+ private String histItemFilterQueryProvider(FilterCriteria filter, int numberDecimalcount, String table,
+ String simpleName) {
+ logger.debug(
+ "JDBC::getHistItemFilterQueryProvider filter = {}, numberDecimalcount = {}, table = {}, simpleName = {}",
+ filter.toString(), numberDecimalcount, table, simpleName);
+
+ String filterString = "";
+ if (filter.getBeginDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME>'" + JDBC_DATE_FORMAT.format(filter.getBeginDate()) + "'";
+ }
+ if (filter.getEndDate() != null) {
+ filterString += filterString.isEmpty() ? " WHERE" : " AND";
+ filterString += " TIME<'" + JDBC_DATE_FORMAT.format(filter.getEndDate()) + "'";
+ }
+ filterString += (filter.getOrdering() == Ordering.ASCENDING) ? " ORDER BY time ASC" : " ORDER BY time DESC";
+ if (filter.getPageSize() != 0x7fffffff) {
+ // see:
+ // http://www.jooq.org/doc/3.5/manual/sql-building/sql-statements/select-statement/limit-clause/
+ filterString += " OFFSET " + filter.getPageNumber() * filter.getPageSize() + " LIMIT "
+ + filter.getPageSize();
+ }
+ String queryString = "NUMBERITEM".equalsIgnoreCase(simpleName) && numberDecimalcount > -1
+ ? "SELECT time, ROUND(CAST (value AS numeric)," + numberDecimalcount + ") FROM " + table
+ : "SELECT time, value FROM " + table;
+ if (!filterString.isEmpty()) {
+ queryString += filterString;
+ }
+ logger.debug("JDBC::query queryString = {}", queryString);
+ return queryString;
+ }
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.db;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Extended database configuration class. It represents the
+ * database-specific configuration and overrides or supplements the
+ * default settings from JdbcBaseDAO. Enter only the differences to JdbcBaseDAO here.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcSqliteDAO extends JdbcBaseDAO {
+ private final Logger logger = LoggerFactory.getLogger(JdbcSqliteDAO.class);
+
+ /********
+ * INIT *
+ ********/
+ public JdbcSqliteDAO() {
+ super();
+ initSqlQueries();
+ initSqlTypes();
+ initDbProps();
+ }
+
+ private void initSqlQueries() {
+ logger.debug("JDBC::initSqlQueries: '{}'", this.getClass().getSimpleName());
+ sqlGetDB = "PRAGMA DATABASE_LIST"; // "SELECT SQLITE_VERSION()"; // "PRAGMA DATABASE_LIST"->db Path/Name
+ // "PRAGMA SCHEMA_VERSION";
+ sqlIfTableExists = "SELECT name FROM sqlite_master WHERE type='table' AND name='#searchTable#'";
+ sqlCreateItemsTableIfNot = "CREATE TABLE IF NOT EXISTS #itemsManageTable# (ItemId INTEGER PRIMARY KEY AUTOINCREMENT, #colname# #coltype# NOT NULL)";
+ sqlInsertItemValue = "INSERT OR IGNORE INTO #tableName# (TIME, VALUE) VALUES( #tablePrimaryValue#, CAST( ? as #dbType#) )";
+ }
+
+ /**
+ * INFO: http://www.java2s.com/Code/Java/Database-SQL-JDBC/StandardSQLDataTypeswithTheirJavaEquivalents.htm
+ */
+ private void initSqlTypes() {
+ logger.debug("JDBC::initSqlTypes: Initialize the type array");
+ sqlTypes.put("tablePrimaryValue", "strftime('%Y-%m-%d %H:%M:%f' , 'now' , 'localtime')");
+ }
+
+ /**
+ * INFO: https://github.com/brettwooldridge/HikariCP
+ */
+ private void initDbProps() {
+ // Properties for HikariCP
+ databaseProps.setProperty("driverClassName", "org.sqlite.JDBC");
+ // Alternatively (and preferably) use dataSourceClassName:
+ // databaseProps.setProperty("dataSourceClassName", "org.sqlite.SQLiteDataSource");
+ }
+
+ /**************
+ * ITEMS DAOs *
+ **************/
+
+ @Override
+ public String doGetDB() {
+ return Yank.queryColumn(sqlGetDB, "file", String.class, null).get(0);
+ }
+
+ @Override
+ public ItemsVO doCreateItemsTableIfNot(ItemsVO vo) {
+ String sql = StringUtilsExt.replaceArrayMerge(sqlCreateItemsTableIfNot,
+ new String[] { "#itemsManageTable#", "#colname#", "#coltype#" },
+ new String[] { vo.getItemsManageTable(), vo.getColname(), vo.getColtype() });
+ logger.debug("JDBC::doCreateItemsTableIfNot sql={}", sql);
+ Yank.execute(sql, null);
+ return vo;
+ }
+
+ /*************
+ * ITEM DAOs *
+ *************/
+ @Override
+ public void doStoreItemValue(Item item, ItemVO vo) {
+ vo = storeItemValueProvider(item, vo);
+ String sql = StringUtilsExt.replaceArrayMerge(sqlInsertItemValue,
+ new String[] { "#tableName#", "#dbType#", "#tablePrimaryValue#" },
+ new String[] { vo.getTableName(), vo.getDbType(), sqlTypes.get("tablePrimaryValue") });
+ Object[] params = new Object[] { vo.getValue() };
+ logger.debug("JDBC::doStoreItemValue sql={} value='{}'", sql, vo.getValue());
+ Yank.execute(sql, params);
+ }
+
+ /****************************
+ * SQL generation Providers *
+ ****************************/
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+
+ /******************************
+ * public Getters and Setters *
+ ******************************/
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.internal;
+
+import java.util.Collections;
+import java.util.Enumeration;
+import java.util.Map;
+import java.util.Properties;
+import java.util.Set;
+import java.util.regex.Matcher;
+import java.util.regex.Pattern;
+
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.persistence.jdbc.db.JdbcBaseDAO;
+import org.openhab.persistence.jdbc.utils.MovingAverage;
+import org.openhab.persistence.jdbc.utils.StringUtilsExt;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Configuration class
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcConfiguration {
+ private final Logger logger = LoggerFactory.getLogger(JdbcConfiguration.class);
+
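+ // Splits a config key at the first dot, e.g. a (hypothetical) key "sqlite.SWITCHITEM"
+ // yields prefix "sqlite" and suffix "SWITCHITEM".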
+ private static final Pattern EXTRACT_CONFIG_PATTERN = Pattern.compile("^(.*?)\\.([0-9.a-zA-Z]+)$");
+ private static final String DB_DAO_PACKAGE = "org.openhab.persistence.jdbc.db.Jdbc";
+
+ private Map<Object, Object> configuration;
+
+ private JdbcBaseDAO dBDAO = null;
+ private String dbName = null;
+ boolean dbConnected = false;
+ boolean driverAvailable = false;
+
+ private String serviceName;
+ private String name = "jdbc";
+ public final boolean valid;
+
+ // private String url;
+ // private String user;
+ // private String password;
+ private int numberDecimalcount = 3;
+ private boolean tableUseRealItemNames = false;
+ private String tableNamePrefix = "item";
+ private int tableIdDigitCount = 4;
+ private boolean rebuildTableNames = false;
+
+ private int errReconnectThreshold = 0;
+
+ public int timerCount = 0;
+ public int time1000Statements = 0;
+ public long timer1000 = 0;
+ public MovingAverage timeAverage50arr = new MovingAverage(50);
+ public MovingAverage timeAverage100arr = new MovingAverage(100);
+ public MovingAverage timeAverage200arr = new MovingAverage(200);
+ public boolean enableLogTime = false;
+
+ public JdbcConfiguration(Map<Object, Object> configuration) {
+ logger.debug("JDBC::JdbcConfiguration");
+ valid = updateConfig(configuration);
+ }
+
+ private boolean updateConfig(Map<Object, @Nullable Object> config) {
+ configuration = config;
+
+ logger.debug("JDBC::updateConfig: configuration size = {}", configuration.size());
+
+ String user = (String) configuration.get("user");
+ String password = (String) configuration.get("password");
+
+ // mandatory url
+ String url = (String) configuration.get("url");
+
+ if (url == null) {
+ logger.error("Mandatory url parameter is missing in configuration!");
+ return false;
+ }
+
+ Properties parsedURL = StringUtilsExt.parseJdbcURL(url);
+
+ if (user == null || user.isBlank()) {
+ logger.debug("No jdbc:user parameter defined in jdbc.cfg");
+ }
+ if (password == null || password.isBlank()) {
+ logger.debug("No jdbc:password parameter defined in jdbc.cfg.");
+ }
+
+ if (url.isBlank()) {
+ logger.debug(
+ "JDBC url is missing - please configure in jdbc.cfg like 'jdbc:<service>:<host>[:<port>;<attributes>]'");
+ return false;
+ }
+
+ if ("false".equalsIgnoreCase(parsedURL.getProperty("parseValid"))) {
+ Enumeration<?> en = parsedURL.propertyNames();
+ String enstr = "";
+ for (Object key : Collections.list(en)) {
+ enstr += key + " = " + parsedURL.getProperty("" + key) + "\n";
+ }
+ logger.warn(
+ "JDBC url is not well formatted: {}\nPlease configure in openhab.cfg like 'jdbc:<service>:<host>[:<port>;<attributes>]'",
+ enstr);
+ return false;
+ }
+
+ logger.debug("JDBC::updateConfig: user={}", user);
+ logger.debug("JDBC::updateConfig: password exists? {}", password != null && !password.isBlank());
+ logger.debug("JDBC::updateConfig: url={}", url);
+
+ // set database type and database type class
+ setDBDAOClass(parsedURL.getProperty("dbShortcut")); // derby, h2, hsqldb, mariadb, mysql, postgresql,
+ // sqlite
+ // set user
+ if (user != null && !user.isBlank()) {
+ dBDAO.databaseProps.setProperty("dataSource.user", user);
+ }
+
+ // set password
+ if (password != null && !password.isBlank()) {
+ dBDAO.databaseProps.setProperty("dataSource.password", password);
+ }
+
+ // set sql-types from external config
+ setSqlTypes();
+
+ final Pattern isNumericPattern = Pattern.compile("\\d+(\\.\\d+)?");
+ String et = (String) configuration.get("reconnectCnt");
+ if (et != null && !et.isBlank() && isNumericPattern.matcher(et).matches()) {
+ errReconnectThreshold = Integer.parseInt(et);
+ logger.debug("JDBC::updateConfig: errReconnectThreshold={}", errReconnectThreshold);
+ }
+
+ String np = (String) configuration.get("tableNamePrefix");
+ if (np != null && !np.isBlank()) {
+ tableNamePrefix = np;
+ logger.debug("JDBC::updateConfig: tableNamePrefix={}", tableNamePrefix);
+ }
+
+ String dd = (String) configuration.get("numberDecimalcount");
+ if (dd != null && !dd.isBlank() && isNumericPattern.matcher(dd).matches()) {
+ numberDecimalcount = Integer.parseInt(dd);
+ logger.debug("JDBC::updateConfig: numberDecimalcount={}", numberDecimalcount);
+ }
+
+ String rn = (String) configuration.get("tableUseRealItemNames");
+ if (rn != null && !rn.isBlank()) {
+ tableUseRealItemNames = "true".equals(rn) ? Boolean.parseBoolean(rn) : false;
+ logger.debug("JDBC::updateConfig: tableUseRealItemNames={}", tableUseRealItemNames);
+ }
+
+ String td = (String) configuration.get("tableIdDigitCount");
+ if (td != null && !td.isBlank() && isNumericPattern.matcher(td).matches()) {
+ tableIdDigitCount = Integer.parseInt(td);
+ logger.debug("JDBC::updateConfig: tableIdDigitCount={}", tableIdDigitCount);
+ }
+
+ String rt = (String) configuration.get("rebuildTableNames");
+ if (rt != null && !rt.isBlank()) {
+ rebuildTableNames = Boolean.parseBoolean(rt);
+ logger.debug("JDBC::updateConfig: rebuildTableNames={}", rebuildTableNames);
+ }
+
+ // undocumented
+ String ac = (String) configuration.get("maximumPoolSize");
+ if (ac != null && !ac.isBlank()) {
+ dBDAO.databaseProps.setProperty("maximumPoolSize", ac);
+ }
+
+ // undocumented
+ String ic = (String) configuration.get("minimumIdle");
+ if (ic != null && !ic.isBlank()) {
+ dBDAO.databaseProps.setProperty("minimumIdle", ic);
+ }
+
+ // undocumented
+ String it = (String) configuration.get("idleTimeout");
+ if (it != null && !it.isBlank()) {
+ dBDAO.databaseProps.setProperty("idleTimeout", it);
+ }
+ // undocumented
+ String ent = (String) configuration.get("enableLogTime");
+ if (ent != null && !ent.isBlank()) {
+ enableLogTime = "true".equals(ent) ? Boolean.parseBoolean(ent) : false;
+ }
+ logger.debug("JDBC::updateConfig: enableLogTime {}", enableLogTime);
+
+ // undocumented
+ String fd = (String) configuration.get("driverClassName");
+ if (fd != null && !fd.isBlank()) {
+ dBDAO.databaseProps.setProperty("driverClassName", fd);
+ }
+
+ // undocumented
+ String ds = (String) configuration.get("dataSourceClassName");
+ if (ds != null && !ds.isBlank()) {
+ dBDAO.databaseProps.setProperty("dataSourceClassName", ds);
+ }
+
+ // undocumented
+ String dn = dBDAO.databaseProps.getProperty("driverClassName");
+ if (dn == null) {
+ dn = dBDAO.databaseProps.getProperty("dataSourceClassName");
+ } else {
+ dBDAO.databaseProps.setProperty("jdbcUrl", url);
+ }
+
+ // test if JDBC driver bundle is available
+ testJDBCDriver(dn);
+
+ logger.debug("JDBC::updateConfig: configuration complete. service={}", getName());
+
+ return true;
+ }
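+
+ // Illustrative jdbc.cfg sketch covering the keys read above (the values are examples only, not shipped
+ // defaults):
+ // url=jdbc:mariadb://192.168.0.1:3306/testMariadb
+ // user=test
+ // password=test
+ // tableNamePrefix=item
+ // tableIdDigitCount=4
+ // reconnectCnt=1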
+
+ private void setDBDAOClass(String sn) {
+ serviceName = "none";
+
+ // set database type
+ if (sn == null || sn.isBlank() || sn.length() < 2) {
+ logger.error(
+ "JDBC::updateConfig: Required database url like 'jdbc:<service>:<host>[:<port>;<attributes>]' - please configure the jdbc:url parameter in openhab.cfg");
+ } else {
+ serviceName = sn;
+ }
+ logger.debug("JDBC::updateConfig: found serviceName = '{}'", serviceName);
+
+ // set class for database type
+ String ddp = DB_DAO_PACKAGE + serviceName.toUpperCase().charAt(0) + serviceName.toLowerCase().substring(1)
+ + "DAO";
+
+ logger.debug("JDBC::updateConfig: Init Data Access Object Class: '{}'", ddp);
+ try {
+ dBDAO = (JdbcBaseDAO) Class.forName(ddp).newInstance();
+ logger.debug("JDBC::updateConfig: dBDAO ClassName={}", dBDAO.getClass().getName());
+ } catch (InstantiationException e) {
+ logger.error("JDBC::updateConfig: InstantiationException: {}", e.getMessage());
+ } catch (IllegalAccessException e) {
+ logger.error("JDBC::updateConfig: IllegalAccessException: {}", e.getMessage());
+ } catch (ClassNotFoundException e) {
+ logger.warn("JDBC::updateConfig: no Configuration for serviceName '{}' found. ClassNotFoundException: {}",
+ serviceName, e.getMessage());
+ logger.debug("JDBC::updateConfig: using default Database Configuration: JdbcBaseDAO !!");
+ dBDAO = new JdbcBaseDAO();
+ logger.debug("JDBC::updateConfig: dBConfig done");
+ }
+ }
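+
+ // Illustrative mapping (derived from DB_DAO_PACKAGE above): serviceName 'mariadb' resolves to
+ // 'org.openhab.persistence.jdbc.db.JdbcMariadbDAO'; an unknown serviceName falls back to JdbcBaseDAO.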
+
+ private void setSqlTypes() {
+ Set<Object> keys = configuration.keySet();
+
+ for (Object k : keys) {
+ String key = (String) k;
+ Matcher matcher = EXTRACT_CONFIG_PATTERN.matcher(key);
+ if (!matcher.matches()) {
+ continue;
+ }
+ matcher.reset();
+ matcher.find();
+ if (!matcher.group(1).equals("sqltype")) {
+ continue;
+ }
+ String itemType = matcher.group(2);
+ if (!itemType.startsWith("table")) {
+ itemType = itemType.toUpperCase() + "ITEM";
+ }
+ String value = (String) configuration.get(key);
+ logger.debug("JDBC::updateConfig: set sqlTypes: itemType={} value={}", itemType, value);
+ dBDAO.sqlTypes.put(itemType, value);
+ }
+ }
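+
+ // Illustrative example (assumed config entry, not original code): 'sqltype.number=DECIMAL(10,3)' matches
+ // EXTRACT_CONFIG_PATTERN with group(1)='sqltype' and group(2)='number' and is stored in sqlTypes as
+ // 'NUMBERITEM' -> 'DECIMAL(10,3)'; keys starting with 'table' (e.g. 'sqltype.tablePrimaryValue') are
+ // stored unchanged.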
+
+ private void testJDBCDriver(String driver) {
+ driverAvailable = true;
+ try {
+ Class.forName(driver);
+ logger.debug("JDBC::updateConfig: load JDBC-driverClass was successful: '{}'", driver);
+ } catch (ClassNotFoundException e) {
+ driverAvailable = false;
+ logger.error(
+ "JDBC::updateConfig: could NOT load JDBC-driverClassName or JDBC-dataSourceClassName. ClassNotFoundException: '{}'",
+ e.getMessage());
+ String warn = ""
+ + "\n\n\t!!!\n\tTo avoid this error, place an appropriate JDBC driver file for serviceName '{}' in addons directory.\n"
+ + "\tCopy missing JDBC-Driver-jar to your OpenHab/addons Folder.\n\t!!!\n" + "\tDOWNLOAD: \n";
+ if (serviceName.equals("derby")) {
+ warn += "\tDerby: version >= 10.11.1.1 from http://mvnrepository.com/artifact/org.apache.derby/derby\n";
+ } else if (serviceName.equals("h2")) {
+ warn += "\tH2: version >= 1.4.189 from http://mvnrepository.com/artifact/com.h2database/h2\n";
+ } else if (serviceName.equals("hsqldb")) {
+ warn += "\tHSQLDB: version >= 2.3.3 from http://mvnrepository.com/artifact/org.hsqldb/hsqldb\n";
+ } else if (serviceName.equals("mariadb")) {
+ warn += "\tMariaDB: version >= 1.2.0 from http://mvnrepository.com/artifact/org.mariadb.jdbc/mariadb-java-client\n";
+ } else if (serviceName.equals("mysql")) {
+ warn += "\tMySQL: version >= 5.1.36 from http://mvnrepository.com/artifact/mysql/mysql-connector-java\n";
+ } else if (serviceName.equals("postgresql")) {
+ warn += "\tPostgreSQL:version >= 9.4.1208 from http://mvnrepository.com/artifact/org.postgresql/postgresql\n";
+ } else if (serviceName.equals("sqlite")) {
+ warn += "\tSQLite: version >= 3.16.1 from http://mvnrepository.com/artifact/org.xerial/sqlite-jdbc\n";
+ }
+ logger.warn(warn, serviceName);
+ }
+ }
+
+ public Properties getHikariConfiguration() {
+ return dBDAO.databaseProps;
+ }
+
+ public String getName() {
+ // return serviceName;
+ return name;
+ }
+
+ public String getServiceName() {
+ return serviceName;
+ }
+
+ public String getTableNamePrefix() {
+ return tableNamePrefix;
+ }
+
+ public int getErrReconnectThreshold() {
+ return errReconnectThreshold;
+ }
+
+ public boolean getRebuildTableNames() {
+ return rebuildTableNames;
+ }
+
+ public int getNumberDecimalcount() {
+ return numberDecimalcount;
+ }
+
+ public boolean getTableUseRealItemNames() {
+ return tableUseRealItemNames;
+ }
+
+ public int getTableIdDigitCount() {
+ return tableIdDigitCount;
+ }
+
+ public JdbcBaseDAO getDBDAO() {
+ return dBDAO;
+ }
+
+ public String getDbName() {
+ return dbName;
+ }
+
+ public void setDbName(String dbName) {
+ this.dbName = dbName;
+ }
+
+ public boolean isDbConnected() {
+ return dbConnected;
+ }
+
+ public void setDbConnected(boolean dbConnected) {
+ logger.debug("JDBC::setDbConnected {}", dbConnected);
+ // Initializing step, after db is connected.
+ // Initialize sqlTypes, depending on DB version for example
+ dBDAO.initAfterFirstDbConnection();
+ // Run once again so that externally configured SqlTypes take priority!
+ setSqlTypes();
+ this.dbConnected = dbConnected;
+ }
+
+ public boolean isDriverAvailable() {
+ return driverAvailable;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.internal;
+
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.stream.Collectors;
+
+import org.knowm.yank.Yank;
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.persistence.jdbc.model.ItemVO;
+import org.openhab.persistence.jdbc.model.ItemsVO;
+import org.openhab.persistence.jdbc.model.JdbcPersistenceItemInfo;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Mapper class
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcMapper {
+ private final Logger logger = LoggerFactory.getLogger(JdbcMapper.class);
+
+ // Error counter - used to reconnect to database on error
+ protected int errCnt;
+ protected boolean initialized = false;
+ protected JdbcConfiguration conf = null;
+ protected final Map<String, String> sqlTables = new HashMap<>();
+ private long afterAccessMin = 10000;
+ private long afterAccessMax = 0;
+ private static final String ITEM_NAME_PATTERN = "[^a-zA-Z_0-9\\-]";
+
+ /*****************
+ * MAPPER ITEMS *
+ *****************/
+ public boolean pingDB() {
+ logger.debug("JDBC::pingDB");
+ boolean ret = false;
+ long timerStart = System.currentTimeMillis();
+ if (openConnection()) {
+ if (conf.getDbName() == null) {
+ logger.debug(
+ "JDBC::pingDB: asking db for its name as the very first db action after the connection is established.");
+ String dbName = conf.getDBDAO().doGetDB();
+ conf.setDbName(dbName);
+ ret = dbName.length() > 0;
+ } else {
+ ret = conf.getDBDAO().doPingDB() > 0;
+ }
+ }
+ logTime("pingDB", timerStart, System.currentTimeMillis());
+ return ret;
+ }
+
+ public String getDB() {
+ logger.debug("JDBC::getDB");
+ long timerStart = System.currentTimeMillis();
+ String res = conf.getDBDAO().doGetDB();
+ logTime("pingDB", timerStart, System.currentTimeMillis());
+ return res;
+ }
+
+ public ItemsVO createNewEntryInItemsTable(ItemsVO vo) {
+ logger.debug("JDBC::createNewEntryInItemsTable");
+ long timerStart = System.currentTimeMillis();
+ Long i = conf.getDBDAO().doCreateNewEntryInItemsTable(vo);
+ vo.setItemid(i.intValue());
+ logTime("doCreateNewEntryInItemsTable", timerStart, System.currentTimeMillis());
+ return vo;
+ }
+
+ public boolean createItemsTableIfNot(ItemsVO vo) {
+ logger.debug("JDBC::createItemsTableIfNot");
+ long timerStart = System.currentTimeMillis();
+ conf.getDBDAO().doCreateItemsTableIfNot(vo);
+ logTime("doCreateItemsTableIfNot", timerStart, System.currentTimeMillis());
+ return true;
+ }
+
+ public ItemsVO deleteItemsEntry(ItemsVO vo) {
+ logger.debug("JDBC::deleteItemsEntry");
+ long timerStart = System.currentTimeMillis();
+ conf.getDBDAO().doDeleteItemsEntry(vo);
+ logTime("deleteItemsEntry", timerStart, System.currentTimeMillis());
+ return vo;
+ }
+
+ public List<ItemsVO> getItemIDTableNames() {
+ logger.debug("JDBC::getItemIDTableNames");
+ long timerStart = System.currentTimeMillis();
+ List<ItemsVO> vo = conf.getDBDAO().doGetItemIDTableNames(new ItemsVO());
+ logTime("getItemIDTableNames", timerStart, System.currentTimeMillis());
+ return vo;
+ }
+
+ public List<ItemsVO> getItemTables() {
+ logger.debug("JDBC::getItemTables");
+ long timerStart = System.currentTimeMillis();
+ ItemsVO vo = new ItemsVO();
+ vo.setJdbcUriDatabaseName(conf.getDbName());
+ List<ItemsVO> vol = conf.getDBDAO().doGetItemTables(vo);
+ logTime("getItemTables", timerStart, System.currentTimeMillis());
+ return vol;
+ }
+
+ /****************
+ * MAPPERS ITEM *
+ ****************/
+ public void updateItemTableNames(List<ItemVO> vol) {
+ logger.debug("JDBC::updateItemTableNames");
+ long timerStart = System.currentTimeMillis();
+ conf.getDBDAO().doUpdateItemTableNames(vol);
+ logTime("updateItemTableNames", timerStart, System.currentTimeMillis());
+ }
+
+ public ItemVO createItemTable(ItemVO vo) {
+ logger.debug("JDBC::createItemTable");
+ long timerStart = System.currentTimeMillis();
+ conf.getDBDAO().doCreateItemTable(vo);
+ logTime("createItemTable", timerStart, System.currentTimeMillis());
+ return vo;
+ }
+
+ public Item storeItemValue(Item item) {
+ logger.debug("JDBC::storeItemValue: item={}", item.toString());
+ String tableName = getTable(item);
+ if (tableName == null) {
+ logger.error("JDBC::store: Unable to store item '{}'.", item.getName());
+ return item;
+ }
+ long timerStart = System.currentTimeMillis();
+ conf.getDBDAO().doStoreItemValue(item, new ItemVO(tableName, null));
+ logTime("storeItemValue", timerStart, System.currentTimeMillis());
+ errCnt = 0;
+ return item;
+ }
+
+ public List<HistoricItem> getHistItemFilterQuery(FilterCriteria filter, int numberDecimalcount, String table,
+ Item item) {
+ logger.debug(
+ "JDBC::getHistItemFilterQuery filter='{}' numberDecimalcount='{}' table='{}' item='{}' itemName='{}'",
+ (filter != null), numberDecimalcount, table, item, item.getName());
+ if (table != null) {
+ long timerStart = System.currentTimeMillis();
+ List<HistoricItem> r = conf.getDBDAO().doGetHistItemFilterQuery(item, filter, numberDecimalcount, table,
+ item.getName());
+ logTime("insertItemValue", timerStart, System.currentTimeMillis());
+ return r;
+ } else {
+ logger.error("JDBC::getHistItemFilterQuery: TABLE is NULL; cannot get data from non-existent table.");
+ }
+ return null;
+ }
+
+ /***********************
+ * DATABASE CONNECTION *
+ ***********************/
+ protected boolean openConnection() {
+ logger.debug("JDBC::openConnection isDriverAvailable: {}", conf.isDriverAvailable());
+ if (conf.isDriverAvailable() && !conf.isDbConnected()) {
+ logger.info("JDBC::openConnection: Driver is available::Yank setupDataSource");
+ Yank.setupDefaultConnectionPool(conf.getHikariConfiguration());
+ conf.setDbConnected(true);
+ return true;
+ } else if (!conf.isDriverAvailable()) {
+ logger.warn("JDBC::openConnection: no driver available!");
+ initialized = false;
+ return false;
+ }
+ return true;
+ }
+
+ protected void closeConnection() {
+ logger.debug("JDBC::closeConnection");
+ // Closes all open connection pools
+ Yank.releaseDefaultConnectionPool();
+ conf.setDbConnected(false);
+ }
+
+ protected boolean checkDBAccessability() {
+ // Check if connection is valid
+ if (initialized) {
+ return true;
+ }
+ // first
+ boolean p = pingDB();
+ if (p) {
+ logger.debug("JDBC::checkDBAcessability, first try connection: {}", p);
+ return (p && !(conf.getErrReconnectThreshold() > 0 && errCnt <= conf.getErrReconnectThreshold()));
+ } else {
+ // second
+ p = pingDB();
+ logger.debug("JDBC::checkDBAcessability, second try connection: {}", p);
+ return (p && !(conf.getErrReconnectThreshold() > 0 && errCnt <= conf.getErrReconnectThreshold()));
+ }
+ }
+
+ /**************************
+ * DATABASE TABLEHANDLING *
+ **************************/
+ protected void checkDBSchema() {
+ // Create Items Table if does not exist
+ createItemsTableIfNot(new ItemsVO());
+ if (conf.getRebuildTableNames()) {
+ formatTableNames();
+ logger.info(
+ "JDBC::checkDBSchema: Rebuild complete, configure the 'rebuildTableNames' setting to 'false' to stop rebuilds on startup");
+ } else {
+ List<ItemsVO> al;
+ // Reset the error counter
+ errCnt = 0;
+ al = getItemIDTableNames();
+ for (int i = 0; i < al.size(); i++) {
+ String t = getTableName(al.get(i).getItemid(), al.get(i).getItemname());
+ sqlTables.put(al.get(i).getItemname(), t);
+ }
+ }
+ }
+
+ protected String getTable(Item item) {
+ int rowId = 0;
+ ItemsVO isvo;
+ ItemVO ivo;
+
+ String itemName = item.getName();
+ String tableName = sqlTables.get(itemName);
+
+ // Table already exists - return the name
+ if (tableName != null) {
+ return tableName;
+ }
+
+ logger.debug("JDBC::getTable: no table found for item '{}' in sqlTables", itemName);
+
+ // Create a new entry in items table
+ isvo = new ItemsVO();
+ isvo.setItemname(itemName);
+ isvo = createNewEntryInItemsTable(isvo);
+ rowId = isvo.getItemid();
+ if (rowId == 0) {
+ logger.error("JDBC::getTable: Creating table for item '{}' failed.", itemName);
+ }
+ // Create the table name
+ logger.debug("JDBC::getTable: getTableName with rowId={} itemName={}", rowId, itemName);
+ tableName = getTableName(rowId, itemName);
+
+ // An error occurred adding the item name into the index list!
+ if (tableName == null) {
+ logger.error("JDBC::getTable: tableName was null; could not create a table for item '{}'", itemName);
+ return null;
+ }
+
+ // Create table for item
+ String dataType = conf.getDBDAO().getDataType(item);
+ ivo = new ItemVO(tableName, itemName);
+ ivo.setDbType(dataType);
+ ivo = createItemTable(ivo);
+ logger.debug("JDBC::getTable: Table created for item '{}' with dataType {} in SQL database.", itemName,
+ dataType);
+ sqlTables.put(itemName, tableName);
+
+ // Check if the new entry is in the table list
+ // If it's not in the list, then there was an error and we need to do
+ // some tidying up
+ // The item needs to be removed from the index table to avoid duplicates
+ if (sqlTables.get(itemName) == null) {
+ logger.error("JDBC::getTable: Item '{}' was not added to the table - removing index", itemName);
+ isvo = new ItemsVO();
+ isvo.setItemname(itemName);
+ deleteItemsEntry(isvo);
+ }
+
+ return tableName;
+ }
+
+ private void formatTableNames() {
+ boolean tmpinit = initialized;
+ if (tmpinit) {
+ initialized = false;
+ }
+
+ List<ItemsVO> al;
+ Map<Integer, String> tableIds = new HashMap<>();
+
+ //
+ al = getItemIDTableNames();
+ for (int i = 0; i < al.size(); i++) {
+ String t = getTableName(al.get(i).getItemid(), al.get(i).getItemname());
+ sqlTables.put(al.get(i).getItemname(), t);
+ tableIds.put(al.get(i).getItemid(), t);
+ }
+
+ //
+ al = getItemTables();
+
+ String oldName = "";
+ String newName = "";
+ List<ItemVO> oldNewTablenames = new ArrayList<>();
+ for (int i = 0; i < al.size(); i++) {
+ int id = -1;
+ oldName = al.get(i).getTable_name();
+ logger.info("JDBC::formatTableNames: found Table Name= {}", oldName);
+
+ if (oldName.startsWith(conf.getTableNamePrefix()) && !oldName.contains("_")) {
+ id = Integer.parseInt(oldName.substring(conf.getTableNamePrefix().length()));
+ logger.info("JDBC::formatTableNames: found Table with Prefix '{}' Name= {} id= {}",
+ conf.getTableNamePrefix(), oldName, (id));
+ } else if (oldName.contains("_")) {
+ id = Integer.parseInt(oldName.substring(oldName.lastIndexOf("_") + 1));
+ logger.info("JDBC::formatTableNames: found Table Name= {} id= {}", oldName, (id));
+ }
+ logger.info("JDBC::formatTableNames: found Table id= {}", id);
+
+ newName = tableIds.get(id);
+ logger.info("JDBC::formatTableNames: found Table newName= {}", newName);
+
+ if (newName != null) {
+ if (!oldName.equalsIgnoreCase(newName)) {
+ oldNewTablenames.add(new ItemVO(oldName, newName));
+ logger.info("JDBC::formatTableNames: Table '{}' will be renamed to '{}'", oldName, newName);
+ } else {
+ logger.info("JDBC::formatTableNames: Table oldName='{}' newName='{}' nothing to rename", oldName,
+ newName);
+ }
+ } else {
+ logger.error("JDBC::formatTableNames: Table '{}' could NOT be renamed to '{}'", oldName, newName);
+ break;
+ }
+ }
+
+ updateItemTableNames(oldNewTablenames);
+ logger.info("JDBC::formatTableNames: Finished updating {} item table names", oldNewTablenames.size());
+
+ initialized = tmpinit;
+ }
+
+ private String getTableName(int rowId, String itemName) {
+ return getTableNamePrefix(itemName) + formatRight(rowId, conf.getTableIdDigitCount());
+ }
+
+ private String getTableNamePrefix(String itemName) {
+ String name = conf.getTableNamePrefix();
+ if (conf.getTableUseRealItemNames()) {
+ // Create the table name with real Item Names
+ name = (itemName.replaceAll(ITEM_NAME_PATTERN, "") + "_").toLowerCase();
+ }
+ return name;
+ }
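+
+ // Illustrative examples (assumed item name, not original code), with tableNamePrefix 'item' and
+ // tableIdDigitCount 4:
+ // getTableName(23, "Kitchen_Light") -> "item0023"
+ // with tableUseRealItemNames=true -> "kitchen_light_0023"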
+
+ public Set<PersistenceItemInfo> getItems() {
+ // TODO: in general it would be possible to query the count, earliest and latest values for each item too but it
+ // would be a very costly operation
+ return sqlTables.keySet().stream().map(itemName -> new JdbcPersistenceItemInfo(itemName))
+ .collect(Collectors.<PersistenceItemInfo> toUnmodifiableSet());
+ }
+
+ private static String formatRight(final Object value, final int len) {
+ final String valueAsString = String.valueOf(value);
+ if (valueAsString.length() < len) {
+ final StringBuilder result = new StringBuilder(len);
+ for (int i = len - valueAsString.length(); i > 0; i--) {
+ result.append('0');
+ }
+ result.append(valueAsString);
+ return result.toString();
+ } else {
+ return valueAsString;
+ }
+ }
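+
+ // Illustrative usage (not original code): formatRight(7, 4) returns "0007", while formatRight(12345, 4)
+ // returns "12345" because values longer than len are returned unchanged.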
+
+ /*****************
+ * H E L P E R S *
+ *****************/
+ private void logTime(String me, long timerStart, long timerStop) {
+ if (conf.enableLogTime && logger.isInfoEnabled()) {
+ conf.timerCount++;
+ int timerDiff = (int) (timerStop - timerStart);
+ if (timerDiff < afterAccessMin) {
+ afterAccessMin = timerDiff;
+ }
+ if (timerDiff > afterAccessMax) {
+ afterAccessMax = timerDiff;
+ }
+ conf.timeAverage50arr.add(timerDiff);
+ conf.timeAverage100arr.add(timerDiff);
+ conf.timeAverage200arr.add(timerDiff);
+ if (conf.timerCount == 1) {
+ conf.timer1000 = System.currentTimeMillis();
+ }
+ if (conf.timerCount == 1001) {
+ conf.time1000Statements = (int) ((System.currentTimeMillis() - conf.timer1000) / 1000); // seconds
+ conf.timerCount = 0;
+ }
+ logger.info(
+ "JDBC::logTime: '{}':\n afterAccess = {} ms\n timeAverage50 = {} ms\n timeAverage100 = {} ms\n timeAverage200 = {} ms\n afterAccessMin = {} ms\n afterAccessMax = {} ms\n 1000Statements = {} sec\n statementCount = {}\n",
+ me, timerDiff, conf.timeAverage50arr.getAverageInteger(),
+ conf.timeAverage100arr.getAverageInteger(), conf.timeAverage200arr.getAverageInteger(),
+ afterAccessMin, afterAccessMax, conf.time1000Statements, conf.timerCount);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.internal;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.GroupItem;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.UnDefType;
+import org.osgi.framework.BundleContext;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.ConfigurationPolicy;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * This is the implementation of the JDBC {@link PersistenceService}.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ * @author Kai Kreuzer - Migration to 3.x
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.jdbc", configurationPolicy = ConfigurationPolicy.REQUIRE)
+public class JdbcPersistenceService extends JdbcMapper implements QueryablePersistenceService {
+
+ private final Logger logger = LoggerFactory.getLogger(JdbcPersistenceService.class);
+
+ private final ItemRegistry itemRegistry;
+
+ @Activate
+ public JdbcPersistenceService(final @Reference ItemRegistry itemRegistry) {
+ this.itemRegistry = itemRegistry;
+ }
+
+ /**
+ * Called by the SCR to activate the component with its configuration read
+ * from the ConfigAdmin service
+ *
+ * @param bundleContext
+ * BundleContext of the Bundle that defines this component
+ * @param configuration
+ * Configuration properties for this component obtained from the
+ * ConfigAdmin service
+ */
+ @Activate
+ public void activate(BundleContext bundleContext, Map<Object, Object> configuration) {
+ logger.debug("JDBC::activate: persistence service activated");
+ updateConfig(configuration);
+ }
+
+ /**
+ * Called by the SCR to deactivate the component when either the
+ * configuration is removed or mandatory references are no longer satisfied
+ * or the component has simply been stopped.
+ *
+ * @param reason
+ * Reason code for the deactivation:<br>
+ * <ul>
+ * <li>0 – Unspecified
+ * <li>1 – The component was disabled
+ * <li>2 – A reference became unsatisfied
+ * <li>3 – A configuration was changed
+ * <li>4 – A configuration was deleted
+ * <li>5 – The component was disposed
+ * <li>6 – The bundle was stopped
+ * </ul>
+ */
+ @Deactivate
+ public void deactivate(final int reason) {
+ logger.debug("JDBC::deactivate: persistence bundle stopping. Disconnecting from database. reason={}", reason);
+ // closeConnection();
+ initialized = false;
+ }
+
+ @Override
+ public String getId() {
+ logger.debug("JDBC::getName: returning name 'jdbc' for queryable persistence service.");
+ return "jdbc";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "JDBC";
+ }
+
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ // Do not store undefined/uninitialised data
+ if (item.getState() instanceof UnDefType) {
+ logger.debug("JDBC::store: ignore Item '{}' because it is UnDefType", item.getName());
+ return;
+ }
+ if (!checkDBAccessability()) {
+ logger.warn(
+ "JDBC::store: No connection to database. Cannot persist item '{}'! Will retry connecting to database when error count:{} equals errReconnectThreshold:{}",
+ item, errCnt, conf.getErrReconnectThreshold());
+ return;
+ }
+ long timerStart = System.currentTimeMillis();
+ storeItemValue(item);
+ logger.debug("JDBC: Stored item '{}' as '{}' in SQL database at {} in {} ms.", item.getName(),
+ item.getState().toString(), (new java.util.Date()).toString(), System.currentTimeMillis() - timerStart);
+ }
+
+ @Override
+ public Set<PersistenceItemInfo> getItemInfo() {
+ return getItems();
+ }
+
+ /**
+ * Queries the {@link PersistenceService} for data with a given filter
+ * criteria
+ *
+ * @param filter
+ * the filter to apply to the query
+ * @return a time series of items
+ */
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ if (!checkDBAccessability()) {
+ logger.warn("JDBC::query: database not connected, query aborted for item '{}'", filter.getItemName());
+ return Collections.emptyList();
+ }
+
+ // Get the item name from the filter
+ // Also get the Item object so we can determine the type
+ Item item = null;
+ String itemName = filter.getItemName();
+ logger.debug("JDBC::query: item is {}", itemName);
+ try {
+ item = itemRegistry.getItem(itemName);
+ } catch (ItemNotFoundException e1) {
+ logger.error("JDBC::query: unable to get item for itemName: '{}'. Ignore and give up!", itemName);
+ return Collections.emptyList();
+ }
+
+ if (item instanceof GroupItem) {
+ // For a GroupItem, its BaseItem is needed to determine the correct value type.
+ item = GroupItem.class.cast(item).getBaseItem();
+ logger.debug("JDBC::query: item is instanceof GroupItem '{}'", itemName);
+ if (item == null) {
+ logger.debug("JDBC::query: BaseItem of GroupItem is null. Ignore and give up!");
+ return Collections.emptyList();
+ }
+ if (item instanceof GroupItem) {
+ logger.debug("JDBC::query: BaseItem of GroupItem is a GroupItem too. Ignore and give up!");
+ return Collections.emptyList();
+ }
+ }
+
+ String table = sqlTables.get(itemName);
+ if (table == null) {
+ logger.warn(
+ "JDBC::query: unable to find table for query, no data in database for item '{}'. Current number of tables in the database: {}",
+ itemName, sqlTables.size());
+ // if enabled, table will be created immediately
+ logger.warn("JDBC::query: try to generate the table for item '{}'", itemName);
+ table = getTable(item);
+ }
+
+ long timerStart = System.currentTimeMillis();
+ List<HistoricItem> items = new ArrayList<>();
+ items = getHistItemFilterQuery(filter, conf.getNumberDecimalcount(), table, item);
+
+ logger.debug("JDBC::query: query for {} returned {} rows in {} ms", item.getName(), items.size(),
+ System.currentTimeMillis() - timerStart);
+
+ // Success
+ errCnt = 0;
+ return items;
+ }
+
+ public void updateConfig(Map<Object, Object> configuration) {
+ logger.debug("JDBC::updateConfig");
+
+ conf = new JdbcConfiguration(configuration);
+ if (conf.valid && checkDBAccessability()) {
+ checkDBSchema();
+ // connection has been established ... initialization completed!
+ initialized = true;
+ } else {
+ initialized = false;
+ }
+
+ logger.debug("JDBC::updateConfig: configuration complete for service={}.", getId());
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return Collections.emptyList();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.model;
+
+import java.io.Serializable;
+import java.util.Date;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Represents the Item-data on the part of MyBatis/database.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class ItemVO implements Serializable {
+ private final Logger logger = LoggerFactory.getLogger(ItemVO.class);
+
+ private static final long serialVersionUID = 1871441039821454890L;
+
+ private String tableName;
+ private String newTableName;
+ private String dbType;
+ private String jdbcType;
+ private String itemType;
+ private Class<?> javaType;
+ private Date time;
+ private Object value;
+
+ public ItemVO(String tableName, String newTableName) {
+ logger.debug("JDBC:ItemVO tableName={}; newTableName={}; ", tableName, newTableName);
+ this.tableName = tableName;
+ this.newTableName = newTableName;
+ }
+
+ public ItemVO() {
+ }
+
+ public void setValueTypes(String dbType, Class<?> javaType) {
+ logger.debug("JDBC:ItemVO setValueTypes dbType={}; javaType={};", dbType, javaType);
+ this.dbType = dbType;
+ this.javaType = javaType;
+ }
+
+ public String getTableName() {
+ return tableName;
+ }
+
+ public void setTableName(String tableName) {
+ this.tableName = tableName;
+ }
+
+ public String getNewTableName() {
+ return newTableName;
+ }
+
+ public void setNewTableName(String newTableName) {
+ this.newTableName = newTableName;
+ }
+
+ public String getDbType() {
+ return dbType;
+ }
+
+ public void setDbType(String dbType) {
+ this.dbType = dbType;
+ }
+
+ public String getJdbcType() {
+ return jdbcType;
+ }
+
+ public void setJdbcType(String jdbcType) {
+ this.jdbcType = jdbcType;
+ }
+
+ public String getItemType() {
+ return itemType;
+ }
+
+ public void setItemType(String itemType) {
+ this.itemType = itemType;
+ }
+
+ public String getJavaType() {
+ return javaType.getName();
+ }
+
+ public void setJavaType(Class<?> javaType) {
+ this.javaType = javaType;
+ }
+
+ public Date getTime() {
+ return time;
+ }
+
+ public void setTime(Date time) {
+ this.time = time;
+ }
+
+ public Object getValue() {
+ return value;
+ }
+
+ public void setValue(Object value) {
+ this.value = value;
+ }
+
+ /**
+ * (non-Javadoc)
+ *
+ * @see java.lang.Object#equals(java.lang.Object)
+ */
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ ItemVO other = (ItemVO) obj;
+ if (value == null) {
+ if (other.value != null) {
+ return false;
+ }
+ } else if (!value.equals(other.value)) {
+ return false;
+ }
+ if (time == null ? other.time != null : !time.equals(other.time)) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("ItemVO [tableName=");
+ builder.append(tableName);
+ builder.append(", newTableName=");
+ builder.append(newTableName);
+ builder.append(", dbType=");
+ builder.append(dbType);
+ builder.append(", javaType=");
+ builder.append(javaType);
+ builder.append(", time=");
+ builder.append(time);
+ builder.append(", value=");
+ builder.append(value);
+ builder.append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.model;
+
+import java.io.Serializable;
+
+/**
+ * Represents the table naming data.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class ItemsVO implements Serializable {
+
+ private static final long serialVersionUID = 2871961811177601520L;
+
+ private static final String STR_FILTER = "[^a-zA-Z0-9]";
+
+ private String coltype = "VARCHAR(500)";
+ private String colname = "itemname";
+ private String itemsManageTable = "items";
+ private int itemid;
+ private String itemname;
+ private String table_name;
+ private String jdbcUriDatabaseName;
+
+ public String getColtype() {
+ return coltype;
+ }
+
+ public void setColtype(String coltype) {
+ this.coltype = coltype.replaceAll(STR_FILTER, "");
+ }
+
+ public String getColname() {
+ return colname;
+ }
+
+ public void setColname(String colname) {
+ this.colname = colname.replaceAll(STR_FILTER, "");
+ }
+
+ public String getItemsManageTable() {
+ return itemsManageTable;
+ }
+
+ public void setItemsManageTable(String itemsManageTable) {
+ this.itemsManageTable = itemsManageTable.replaceAll(STR_FILTER, "");
+ }
+
+ public int getItemid() {
+ return itemid;
+ }
+
+ public void setItemid(int itemid) {
+ this.itemid = itemid;
+ }
+
+ public String getItemname() {
+ return itemname;
+ }
+
+ public void setItemname(String itemname) {
+ this.itemname = itemname;
+ }
+
+ public String getTable_name() {
+ return table_name;
+ }
+
+ public void setTable_name(String table_name) {
+ this.table_name = table_name;
+ }
+
+ public String getJdbcUriDatabaseName() {
+ return jdbcUriDatabaseName;
+ }
+
+ public void setJdbcUriDatabaseName(String jdbcUriDatabaseName) {
+ this.jdbcUriDatabaseName = jdbcUriDatabaseName;
+ }
+
+ /*
+ * (non-Javadoc)
+ *
+ * @see java.lang.Object#hashCode()
+ */
+ @Override
+ public int hashCode() {
+ final int prime = 31;
+ int result = 1;
+ result = prime * result + ((itemname == null) ? 0 : itemname.hashCode());
+ result = prime * result + itemid;
+ return result;
+ }
+
+ /*
+ * (non-Javadoc)
+ *
+ * @see java.lang.Object#equals(java.lang.Object)
+ */
+ @Override
+ public boolean equals(Object obj) {
+ if (this == obj) {
+ return true;
+ }
+ if (obj == null) {
+ return false;
+ }
+ if (getClass() != obj.getClass()) {
+ return false;
+ }
+ ItemsVO other = (ItemsVO) obj;
+ if (itemname == null) {
+ if (other.itemname != null) {
+ return false;
+ }
+ } else if (!itemname.equals(other.itemname)) {
+ return false;
+ }
+ if (itemid != other.itemid) {
+ return false;
+ }
+ return true;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("ItemsVO [coltype=");
+ builder.append(coltype);
+ builder.append(", colname=");
+ builder.append(colname);
+ builder.append(", itemsManageTable=");
+ builder.append(itemsManageTable);
+ builder.append(", itemid=");
+ builder.append(itemid);
+ builder.append(", itemname=");
+ builder.append(itemname);
+ builder.append(", table_name=");
+ builder.append(table_name);
+ builder.append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.model;
+
+import java.time.ZonedDateTime;
+
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * Represents the data on the part of openHAB.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class JdbcHistoricItem implements HistoricItem {
+
+ private final String name;
+ private final State state;
+ private final ZonedDateTime timestamp;
+
+ public JdbcHistoricItem(String name, State state, ZonedDateTime timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public ZonedDateTime getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder builder = new StringBuilder();
+ builder.append("JdbcItem [name=");
+ builder.append(name);
+ builder.append(", state=");
+ builder.append(state);
+ builder.append(", timestamp=");
+ builder.append(timestamp);
+ builder.append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.model;
+
+import java.util.Date;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.persistence.PersistenceItemInfo;
+
+/**
+ * Represents the item info for openHAB.
+ *
+ * @author Christoph Weitkamp - Initial contribution
+ */
+@NonNullByDefault
+public class JdbcPersistenceItemInfo implements PersistenceItemInfo {
+
+ private final String name;
+ private final @Nullable Integer count;
+ private final @Nullable Date earliest;
+ private final @Nullable Date latest;
+
+ public JdbcPersistenceItemInfo(String name) {
+ this(name, null, null, null);
+ }
+
+ public JdbcPersistenceItemInfo(String name, @Nullable Integer count, @Nullable Date earliest,
+ @Nullable Date latest) {
+ this.name = name;
+ this.count = count;
+ this.earliest = earliest;
+ this.latest = latest;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public @Nullable Integer getCount() {
+ return count;
+ }
+
+ @Override
+ public @Nullable Date getEarliest() {
+ return earliest;
+ }
+
+ @Override
+ public @Nullable Date getLatest() {
+ return latest;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.utils;
+
+import java.sql.DatabaseMetaData;
+import java.sql.SQLException;
+
+import org.knowm.yank.Yank;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.zaxxer.hikari.HikariDataSource;
+
+/**
+ * Meta data class
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+public class DbMetaData {
+
+ private final Logger logger = LoggerFactory.getLogger(DbMetaData.class);
+
+ private int dbMajorVersion;
+ private int dbMinorVersion;
+ private int driverMajorVersion;
+ private int driverMinorVersion;
+ private String dbProductName;
+ private String dbProductVersion;
+
+ public DbMetaData() {
+ HikariDataSource h = Yank.getDefaultConnectionPool();
+ // HikariDataSource h = Yank.getDataSource();
+
+ DatabaseMetaData meta;
+ try {
+ meta = h.getConnection().getMetaData();
+
+ // Oracle (and some other vendors) do not support
+ // some of the following methods; therefore, we need
+ // to use try-catch blocks.
+ try {
+ dbMajorVersion = meta.getDatabaseMajorVersion();
+ logger.debug("dbMajorVersion = '{}'", dbMajorVersion);
+ } catch (Exception e) {
+ logger.error("Asking for 'dbMajorVersion' is unsupported: '{}'", e.getMessage());
+ }
+
+ try {
+ dbMinorVersion = meta.getDatabaseMinorVersion();
+ logger.debug("dbMinorVersion = '{}'", dbMinorVersion);
+ } catch (Exception e) {
+ logger.error("Asking for 'dbMinorVersion' is unsupported: '{}'", e.getMessage());
+ }
+
+ driverMajorVersion = meta.getDriverMajorVersion();
+ logger.debug("driverMajorVersion = '{}'", driverMajorVersion);
+
+ driverMinorVersion = meta.getDriverMinorVersion();
+ logger.debug("driverMinorVersion = '{}'", driverMinorVersion);
+
+ dbProductName = meta.getDatabaseProductName();
+ logger.debug("dbProductName = '{}'", dbProductName);
+
+ dbProductVersion = meta.getDatabaseProductVersion();
+ logger.debug("dbProductVersion = '{}'", dbProductVersion);
+ } catch (SQLException e1) {
+ logger.error("Asking for 'dbMajorVersion' seems to be unsupported: '{}'", e1.getMessage());
+ }
+ }
+
+ public int getDbMajorVersion() {
+ return dbMajorVersion;
+ }
+
+ public int getDbMinorVersion() {
+ return dbMinorVersion;
+ }
+
+ public boolean isDbVersionGreater(int major, int minor) {
+ if (dbMajorVersion > major) {
+ return true;
+ } else if (dbMajorVersion == major) {
+ if (dbMinorVersion > minor) {
+ return true;
+ }
+ }
+ return false;
+ }
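+
+ // Illustrative usage (assumed versions, not original code): with a database reporting version 5.7,
+ // isDbVersionGreater(5, 6) returns true and isDbVersionGreater(5, 7) returns false.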
+
+ public int getDriverMajorVersion() {
+ return driverMajorVersion;
+ }
+
+ public int getDriverMinorVersion() {
+ return driverMinorVersion;
+ }
+
+ public boolean isDriverVersionGreater(int major, int minor) {
+ if (driverMajorVersion > major) {
+ return true;
+ } else if (driverMajorVersion == major) {
+ if (driverMinorVersion > minor) {
+ return true;
+ }
+ }
+ return false;
+ }
+
+ public String getDbProductName() {
+ return dbProductName;
+ }
+
+ public String getDbProductVersion() {
+ return dbProductVersion;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.utils;
+
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.LinkedList;
+import java.util.Queue;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * Calculates the average/mean of a number series.
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+@NonNullByDefault
+public class MovingAverage {
+
+ private final Queue<BigDecimal> win = new LinkedList<>();
+ private final int period;
+ private BigDecimal sum = BigDecimal.ZERO;
+
+ public MovingAverage(int period) {
+ assert period > 0 : "Period must be a positive integer";
+ this.period = period;
+ }
+
+ public void add(Double num) {
+ add(new BigDecimal(num));
+ }
+
+ public void add(Long num) {
+ add(new BigDecimal(num));
+ }
+
+ public void add(Integer num) {
+ add(new BigDecimal(num));
+ }
+
+ public void add(BigDecimal num) {
+ sum = sum.add(num);
+ win.add(num);
+ if (win.size() > period) {
+ sum = sum.subtract(win.remove());
+ }
+ }
+
+ public BigDecimal getAverage() {
+ if (win.isEmpty()) {
+ return BigDecimal.ZERO; // technically the average is undefined
+ }
+ BigDecimal divisor = BigDecimal.valueOf(win.size());
+ return sum.divide(divisor, 2, RoundingMode.HALF_UP);
+ }
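+
+ // Illustrative usage (not original code): for new MovingAverage(3), after add(1), add(2), add(3), add(4)
+ // the window holds [2, 3, 4] and getAverage() returns 3.00 (scale 2, HALF_UP).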
+
+ public double getAverageDouble() {
+ return getAverage().doubleValue();
+ }
+
+ public int getAverageInteger() {
+ return getAverage().intValue();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jdbc.utils;
+
+import java.net.URI;
+import java.net.URISyntaxException;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Properties;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.persistence.FilterCriteria;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Utility class
+ *
+ * @author Helmut Lehmeyer - Initial contribution
+ */
+@NonNullByDefault
+public class StringUtilsExt {
+ private static final Logger LOGGER = LoggerFactory.getLogger(StringUtilsExt.class);
+
+ /**
+ * Replaces each match of a regular expression in a string with the corresponding array element
+ *
+ * @param str the string in which the replacements take place
+ * @param separate the regular expression to be replaced
+ * @param separators the values merged into str, one per match
+ * @return the merged string
+ */
+ public static final String replaceArrayMerge(String str, String separate, Object[] separators) {
+ for (int i = 0; i < separators.length; i++) {
+ str = str.replaceFirst(separate, (String) separators[i]);
+ }
+ return str;
+ }
+
+ /**
+ * @see #replaceArrayMerge(String str, String separate, Object[] separators)
+ */
+ public static final String replaceArrayMerge(String str, String[] separate, String[] separators) {
+ for (int i = 0; i < separators.length; i++) {
+ str = str.replaceFirst(separate[i], separators[i]);
+ }
+ return str;
+ }
+
+ /**
+ * @see #parseJdbcURL(String url, Properties def)
+ */
+ public static Properties parseJdbcURL(String url) {
+ return parseJdbcURL(url, null);
+ }
+
+ /**
+ * <b>JDBC-URI Examples:</b><br/>
+ * jdbc:dbShortcut:c:/dev/databaseName<br/>
+ * jdbc:dbShortcut:scheme:c:/dev/databaseName<br/>
+ * jdbc:dbShortcut:scheme:c:\\dev\\databaseName<br/>
+ * jdbc:dbShortcut:./databaseName<br/>
+ * jdbc:dbShortcut:/databaseName<br/>
+ * jdbc:dbShortcut:~/databaseName<br/>
+ * jdbc:dbShortcut:/path/databaseName.db<br/>
+ * jdbc:dbShortcut:./../../path/databaseName<br/>
+ * jdbc:dbShortcut:scheme:./path/../path/databaseName;param1=true;<br/>
+ * jdbc:dbShortcut://192.168.0.145:3306/databaseName?param1=false&param2=true
+ * <p/>
+ *
+ * @param url JDBC-URI
+ * @param def Predefined Properties Object
+ * @return A merged Properties Object may contain:<br/>
+ * parseValid (mandatory)<br/>
+ * scheme<br/>
+ * serverPath<br/>
+ * dbShortcut<br/>
+ * databaseName<br/>
+ * portNumber<br/>
+ * serverName<br/>
+ * pathQuery<br/>
+ */
+ public static Properties parseJdbcURL(String url, @Nullable Properties def) {
+ Properties props;
+ if (def == null) {
+ props = new Properties();
+ } else {
+ props = new Properties(def);
+ }
+
+ if (url == null || url.length() < 9) {
+ return props;
+ }
+
+ // replace all \
+ if (url.contains("\\")) {
+ url = url.replaceAll("\\\\", "/");
+ }
+
+ // replace first ; with ?
+ if (url.contains(";")) {
+ // replace first ; with ?
+ url = url.replaceFirst(";", "?");
+ // replace other ; with &
+ url = url.replaceAll(";", "&");
+ }
+
+ if (url.split(":").length < 3 || url.indexOf("/") == -1) {
+ LOGGER.error("parseJdbcURL: URI '{}' is not well formated, expected uri like 'jdbc:dbShortcut:/path'", url);
+ props.put("parseValid", "false");
+ return props;
+ }
+
+ String[] protAndDb = stringBeforeSubstr(url, ":", 1).split(":");
+ if (!"jdbc".equals(protAndDb[0])) {
+ LOGGER.error("parseJdbcURL: URI '{}' is not well formated, expected suffix 'jdbc' found '{}'", url,
+ protAndDb[0]);
+ props.put("parseValid", "false");
+ return props;
+ }
+ props.put("parseValid", "true");
+ props.put("dbShortcut", protAndDb[1]);
+
+ URI dbURI = null;
+ try {
+ dbURI = new URI(stringAfterSubstr(url, ":", 1).replaceFirst(" ", ""));
+ if (dbURI.getScheme() != null) {
+ props.put("scheme", dbURI.getScheme());
+ dbURI = new URI(stringAfterSubstr(url, ":", 2).replaceFirst(" ", ""));
+ }
+ } catch (URISyntaxException e) {
+ LOGGER.error("parseJdbcURL: URI '{}' is not well formated.", url, e);
+ return props;
+ }
+
+ // Query-Parameters
+ if (dbURI.getQuery() != null) {
+ String[] q = dbURI.getQuery().split("&");
+ for (int i = 0; i < q.length; i++) {
+ String[] t = q[i].split("=");
+ props.put(t[0], t[1]);
+ }
+ props.put("pathQuery", dbURI.getQuery());
+ }
+
+ String path = "";
+ if (dbURI.getPath() != null) {
+ String gp = dbURI.getPath();
+ String st = "/";
+ if (gp.indexOf("/") <= 1) {
+ if (substrPos(gp, st).size() > 1) {
+ path = stringBeforeLastSubstr(gp, st) + st;
+ } else {
+ path = stringBeforeSubstr(gp, st) + st;
+ }
+ }
+ if (dbURI.getScheme() != null && dbURI.getScheme().length() == 1) {
+ path = dbURI.getScheme() + ":" + path;
+ }
+ props.put("serverPath", path);
+ }
+ if (dbURI.getPath() != null) {
+ props.put("databaseName", stringAfterLastSubstr(dbURI.getPath(), "/"));
+ }
+ if (dbURI.getPort() != -1) {
+ props.put("portNumber", dbURI.getPort() + "");
+ }
+ if (dbURI.getHost() != null) {
+ props.put("serverName", dbURI.getHost());
+ }
+
+ return props;
+ }
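+
+ // Illustrative result (assumed url, not original code): for 'jdbc:mariadb://192.168.0.1:3306/testMariadb'
+ // the returned Properties contain parseValid=true, dbShortcut=mariadb, serverName=192.168.0.1,
+ // portNumber=3306 and databaseName=testMariadb.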
+
+ /**
+ * Returns a String before the last occurrence of a substring
+ */
+ public static String stringBeforeLastSubstr(String s, String substr) {
+ List<Integer> a = substrPos(s, substr);
+ return s.substring(0, a.get(a.size() - 1));
+ }
+
+ /**
+ * Returns a String after the last occurrence of a substring
+ */
+ public static String stringAfterLastSubstr(String s, String substr) {
+ List<Integer> a = substrPos(s, substr);
+ return s.substring(a.get(a.size() - 1) + 1);
+ }
+
+ /**
+ * Returns a String after the first occurrence of a substring
+ */
+ public static String stringAfterSubstr(String s, String substr) {
+ return s.substring(s.indexOf(substr) + 1);
+ }
+
+ /**
+ * Returns a String after the n occurrence of a substring
+ */
+ public static String stringAfterSubstr(String s, String substr, int n) {
+ return s.substring(substrPos(s, substr).get(n) + 1);
+ }
+
+ /**
+ * Returns a String before the first occurrence of a substring
+ */
+ public static String stringBeforeSubstr(String s, String substr) {
+ return s.substring(0, s.indexOf(substr));
+ }
+
+ /**
+ * Returns a String before the n occurrence of a substring.
+ */
+ public static String stringBeforeSubstr(String s, String substr, int n) {
+ return s.substring(0, substrPos(s, substr).get(n));
+ }
+
+ /**
+ * Returns a list with indices of the occurrence of a substring.
+ */
+ public static List<Integer> substrPos(String s, String substr) {
+ return substrPos(s, substr, true);
+ }
+
+ /**
+ * Returns a list with indices of the occurrence of a substring.
+ */
+ public static List<Integer> substrPos(String s, String substr, boolean ignoreCase) {
+ int substrLength = substr.length();
+ int strLength = s.length();
+ List<Integer> arr = new ArrayList<>();
+
+ for (int i = 0; i < strLength - substrLength + 1; i++) {
+ if (s.regionMatches(ignoreCase, i, substr, 0, substrLength)) {
+ arr.add(i);
+ }
+ }
+ return arr;
+ }
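+
+ // Illustrative usage (not original code): substrPos("a/b/c", "/") returns [1, 3], the zero-based indices
+ // of each occurrence.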
+
+ /*
+ * (non-Javadoc)
+ *
+ * @see java.lang.Object#toString()
+ */
+ public static String filterToString(FilterCriteria filter) {
+ StringBuilder builder = new StringBuilder();
+ builder.append("FilterCriteria [itemName=");
+ builder.append(filter.getItemName());
+ builder.append(", beginDate=");
+ builder.append(filter.getBeginDate());
+ builder.append(", endDate=");
+ builder.append(filter.getEndDate());
+ builder.append(", pageNumber=");
+ builder.append(filter.getPageNumber());
+ builder.append(", pageSize=");
+ builder.append(filter.getPageSize());
+ builder.append(", operator=");
+ builder.append(filter.getOperator());
+ builder.append(", ordering=");
+ builder.append(filter.getOrdering());
+ builder.append(", state=");
+ builder.append(filter.getState());
+ builder.append("]");
+ return builder.toString();
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<config-description:config-descriptions
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:config-description="https://openhab.org/schemas/config-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/config-description/v1.0.0
+ https://openhab.org/schemas/config-description-1.0.0.xsd">
+
+ <config-description uri="persistence:jdbc">
+
+ <!--
+ # I N S T A L L J D B C P E R S I S T E N C E S E R V I C E
+ #
+ # https://github.com/openhab/openhab/wiki/JDBC-Persistence
+ #
+ # Tested databases/url-prefix: jdbc:derby, jdbc:h2, jdbc:hsqldb, jdbc:mariadb, jdbc:mysql, jdbc:postgresql, jdbc:sqlite
+ #
+ # derby, h2, hsqldb, sqlite can be embedded.
+ # If no database is available it will be created; for example the url 'jdbc:h2:./testH2' creates a new DB in the openHAB folder.
+ #
+ # Create new database, for example on a MySQL-Server use:
+ # CREATE DATABASE `yourDB` CHARACTER SET utf8 COLLATE utf8_general_ci;
+ -->
+
+ <!--
+ # D A T A B A S E C O N F I G
+ # Some URL examples; the 'service' part internally identifies and activates the correct JDBC driver.
+ # required database url like 'jdbc:<service>:<host>[:<port>;<attributes>]'
+ # jdbc:url=jdbc:derby:./testDerby;create=true
+ # jdbc:url=jdbc:h2:./testH2
+ # jdbc:url=jdbc:hsqldb:./testHsqlDb
+ # jdbc:url=jdbc:mariadb://192.168.0.1:3306/testMariadb
+ # jdbc:url=jdbc:mysql://192.168.0.1:3306/testMysql
+ # jdbc:url=jdbc:postgresql://192.168.0.1:5432/testPostgresql
+ # jdbc:url=jdbc:sqlite:./testSqlite.db
+ -->
+ <parameter name="url" type="text" required="true">
+ <label>Database URL</label>
+ <description><![CDATA[Defines required database URL and optional path and parameters.<br>
+ Required database url like 'jdbc:<service>:<host>[:<port>;<attributes>]'<br>
+ Parameter 'service' is used as identifier for the selected JDBC driver.<br>
+ URL-Examples:<br>
+ jdbc:derby:./testDerby;create=true<br>
+ jdbc:h2:./testH2<br>
+ jdbc:hsqldb:./testHsqlDb<br>
+ jdbc:mariadb://192.168.0.1:3306/testMariadb<br>
+ jdbc:mysql://192.168.0.1:3306/testMysql<br>
+ jdbc:postgresql://192.168.0.1:5432/testPostgresql<br>
+ jdbc:sqlite:./testSqlite.db]]></description>
+ </parameter>
+
+ <parameter name="user" type="text" required="false">
+ <label>Database User</label>
+ <description><![CDATA[Defines optional database user.]]></description>
+ </parameter>
+
+ <parameter name="password" type="text" required="false">
+ <label>Database Password</label>
+ <description><![CDATA[Defines optional database password.]]></description>
+ </parameter>
+
+ <!--
+ # I T E M O P E R A T I O N S
+ # optionally tweak the SQL datatypes
+ # see: https://mybatis.github.io/mybatis-3/apidocs/reference/org/apache/ibatis/type/JdbcType.html
+ # see: http://www.h2database.com/html/datatypes.html
+ # see: http://www.postgresql.org/docs/9.3/static/datatype.html
+ # defaults:
+ #sqltype.CALL = VARCHAR(200)
+ #sqltype.COLOR = VARCHAR(70)
+ #sqltype.CONTACT = VARCHAR(6)
+ #sqltype.DATETIME = DATETIME
+ #sqltype.DIMMER = TINYINT
+ #sqltype.LOCATION = VARCHAR(30)
+ #sqltype.NUMBER = DOUBLE
+ #sqltype.ROLLERSHUTTER = TINYINT
+ #sqltype.STRING = VARCHAR(65500)
+ #sqltype.SWITCH = VARCHAR(6)
+
+ # For item type "Number": default decimal digit count (optional, default: 3)
+ #numberDecimalcount=
+ -->
+ <parameter name="sqltype.CALL" type="text" required="false">
+ <label>SqlType CALL</label>
+ <description><![CDATA[Overrides used JDBC/SQL datatype for CALL <br>(optional, default: "VARCHAR(200)"). <br>
+ General about JdbcTypes/SqlTypes see: https://mybatis.github.io/mybatis-3/apidocs/reference/org/apache/ibatis/type/JdbcType.html <br>
+ see: http://www.h2database.com/html/datatypes.html <br>
+ see: http://www.postgresql.org/docs/9.5/static/datatype.html]]></description>
+ </parameter>
+ <parameter name="sqltype.COLOR" type="text" required="false">
+ <label>SqlType COLOR</label>
+ <description><![CDATA[Overrides used JDBC/SQL datatype for COLOR <br>(optional, default: "VARCHAR(70)").]]></description>
+ </parameter>
+ <parameter name="sqltype.CONTACT" type="text" required="false">
+ <label>SqlType CONTACT</label>
+ <description><![CDATA[Overrides used JDBC/SQL datatype for CONTACT <br>(optional, default: "VARCHAR(6)").]]></description>
+ </parameter>
+ <parameter name="sqltype.DATETIME" type="text" required="false">
+ <label>SqlType DATETIME</label>
+ <description><![CDATA[Overrides used JDBC/SQL datatype for DATETIME <br>(optional, default: "DATETIME").]]></description>
+ </parameter>
+ <parameter name="sqltype.DIMMER" type="text" required="false">
+ <label>SqlType DIMMER</label>
+ <description><![CDATA[Overrides used JDBC/SQL datatype for DIMMER <br>(optional, default: "TINYINT").]]></description>
+ </parameter>
+ <parameter name="sqltype.LOCATION" type="text" required="false">
+ <label>SqlType LOCATION</label>
+ <description><![CDATA[Overrides used JDBC/SQL datatype for LOCATION <br>(optional, default: "VARCHAR(30)").]]></description>
+ </parameter>
+ <parameter name="sqltype.NUMBER" type="text" required="false">
+ <label>SqlType NUMBER</label>
+ <description><![CDATA[Overrides used JDBC/SQL datatype for NUMBER <br>(optional, default: "DOUBLE").]]></description>
+ </parameter>
+ <parameter name="sqltype.ROLLERSHUTTER" type="text" required="false">
+ <label>SqlType ROLLERSHUTTER</label>
+ <description><![CDATA[Overrides used JDBC/SQL datatype for ROLLERSHUTTER <br>(optional, default: "TINYINT").]]></description>
+ </parameter>
+ <parameter name="sqltype.STRING" type="text" required="false">
+ <label>SqlType STRING</label>
+ <description><![CDATA[Overrides used JDBC/SQL datatype for STRING <br>(optional, default: "VARCHAR(65500)").]]></description>
+ </parameter>
+ <parameter name="sqltype.SWITCH" type="text" required="false">
+ <label>SqlType SWITCH</label>
+ <description><![CDATA[Overrides used JDBC/SQL datatype for SWITCH <br>(optional, default: "VARCHAR(6)").]]></description>
+ </parameter>
+
+ <!--
+ # T A B L E O P E R A T I O N S
+ # Tablename Prefix String (optional, default: "item")
+ # for Migration from MYSQL-Bundle set to 'Item'.
+ #tableNamePrefix=Item
+
+ # Tablename Prefix generation, using Item real names or "item" (optional, default: false -> "item")
+ # If true, 'tableNamePrefix' is ignored.
+ #tableUseRealItemNames=true
+
+ # Tablename Suffix length (optional, default: 4 -> 0001-9999)
+ # for Migration from MYSQL-Bundle set to 0.
+ #tableIdDigitCount=
+
+ # Rename existing Tables using tableUseRealItemNames and tableIdDigitCount (optional, default: false)
+ # USE WITH CARE! Deactivate after Renaming is done!
+ #rebuildTableNames=true
+ -->
+ <parameter name="tableNamePrefix" type="text" required="false">
+ <label>Tablename Prefix String</label>
+ <description><![CDATA[Tablename prefix string <br>(optional, default: "item"). <br>
+ For migration from MYSQL-Bundle set to 'Item'.]]></description>
+ </parameter>
+ <parameter name="tableUseRealItemNames" type="text" required="false">
+ <label>Tablename Realname Generation</label>
+ <description><![CDATA[Enables table name prefix generation from the item's real name <br>(optional, default: disabled -> "Tablename Prefix String" is used). <br>
+ If true, 'Tablename Prefix String' is ignored.]]></description>
+ <options>
+ <option value="true">Enable</option>
+ <option value="false">Disable</option>
+ </options>
+ </parameter>
+ <parameter name="tableIdDigitCount" type="text" required="false">
+ <label>Tablename Suffix ID Count</label>
+ <description><![CDATA[Tablename Suffix ID Count <br>(optional, default: 4 -> 0001-9999). <br>
+ For migration from MYSQL-Bundle set to 0.]]></description>
+ </parameter>
+ <parameter name="rebuildTableNames" type="text" required="false">
+ <label>Tablename Rebuild</label>
+ <description><![CDATA[Renames existing tables using 'Tablename Realname Generation' and 'Tablename Suffix ID Count' (optional, default: disabled). <br>
+ USE WITH CARE! Deactivate after renaming is done!]]></description>
+ <options>
+ <option value="true">Enable</option>
+ <option value="false">Disable</option>
+ </options>
+ </parameter>
+
+ <!--
+ # D A T A B A S E C O N N E C T I O N S
+ # Some embedded databases can handle only one connection (optional, default: configured per database in package org.openhab.persistence.jdbc.db.*)
+ # see: https://github.com/brettwooldridge/HikariCP/issues/256
+ # maximumPoolSize = 1
+ # minimumIdle = 1
+ -->
+ <parameter name="maximumPoolSize" type="text" required="false">
+ <label>Connections Max Pool Size</label>
+ <description><![CDATA[Overrides the maximum pool size of the database connection pool. <br>(optional, default: differs per database)<br>
+ https://github.com/brettwooldridge/HikariCP/issues/256]]></description>
+ </parameter>
+ <parameter name="minimumIdle" type="text" required="false">
+ <label>Connections Min Idle</label>
+ <description><![CDATA[Overrides the minimum number of idle database connections. <br>(optional, default: differs per database)<br>
+ https://github.com/brettwooldridge/HikariCP/issues/256]]></description>
+ </parameter>
+
+ <!--
+ # T I M E K E E P I N G
+ # (optional, default: false)
+ #enableLogTime=true
+ -->
+ <parameter name="enableLogTime" type="text" required="false">
+ <label>Timekeeping Enable</label>
+ <description><![CDATA[Enables time/performance measurement logging. <br>(optional, default: disabled)]]></description>
+ <options>
+ <option value="true">Enable</option>
+ <option value="false">Disable</option>
+ </options>
+ </parameter>
+
+ </config-description>
+
+</config-description:config-descriptions>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ <attribute name="test" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.persistence.jpa</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+# Java Persistence API (JPA) Persistence
+
+This service allows you to persist state updates using a SQL or NoSQL database through the [Java Persistence API](https://en.wikipedia.org/wiki/Java_Persistence_API).
+The service uses an abstraction layer that theoretically allows it to support many available SQL or NoSQL databases.
+
+It will create one table named `historic_item` where all item states are stored.
+The item state is stored in a string representation.
+
+The service currently supports MySQL, Apache Derby and PostgreSQL databases.
+Only the embedded Apache Derby database driver is included.
+Other drivers must be installed manually.
+(See below for more information on that.)
+
+## Configuration
+
+This service can be configured in the file `services/jpa.cfg`.
+
+| Property | Default | Required | Description |
+| -------- | ------- | :-------: | ------------------------------------------------------------ |
+| url | | Yes | JDBC connection URL. Examples:<br/><br/>`jdbc:postgresql://hab.local:5432/openhab`<br/>`jdbc:derby://hab.local:1527/openhab;create=true`<br/>`jdbc:mysql://localhost:3306/openhab` |
+| driver | | Yes | Database driver class. Examples:<br/><br/>`org.postgresql.Driver`<br/>`org.apache.derby.jdbc.ClientDriver`<br/>`com.mysql.jdbc.Driver`<br/><br/>Only the Apache Derby driver is included with the service. Drivers for other databases must be installed manually. This is a trivial process. Normally JDBC database drivers are packaged as OSGi bundles and can just be dropped into the `addons` folder. This has the advantage that users can update their drivers as needed. The following database drivers are known to work:<br/><br/>`postgresql-9.4-1203-jdbc41.jar`<br/>`postgresql-9.4-1206-jdbc41.jar` |
+| user | | if needed | database user name for connection |
+| password | | if needed | database user password for connection |
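+
+For example, a minimal `services/jpa.cfg` for a local PostgreSQL database might look like this (host, database name, and credentials are placeholders):
+
+```
+url=jdbc:postgresql://localhost:5432/openhab
+driver=org.postgresql.Driver
+user=openhab
+password=secret
+```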
+
+All item- and event-related configuration is done in the file `persistence/jpa.persist`.
+
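+A minimal `persistence/jpa.persist` might look like this, persisting every item on each change and restoring the last state on startup (the catch-all item selection is just an example):
+
+```
+Strategies {
+ default = everyChange
+}
+
+Items {
+ * : strategy = everyChange, restoreOnStartup
+}
+```
+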
+## Adding support for other JPA-supported databases
+
+If a database driver is not packaged as an OSGi bundle, the technique below can be used to extend the openHAB classpath.
+
+Use the following classpath setup in `start.sh` / `start_debug.sh` of openHAB:
+
+```
+cp=$(echo lib/*.jar | tr ' ' ':'):$(find $eclipsehome -name "org.eclipse.equinox.launcher_*.jar" | sort | tail -1);
+```
+
+This will add all `.jar` files from a `lib` folder in the openHAB root directory to the classpath.
+
+Any database supported by JPA can be added this way.
+
+Define `driver` and `url` according to the documentation of the chosen database.
+
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.persistence.jpa</artifactId>
+
+ <name>openHAB Add-ons :: Bundles :: Persistence Service :: JPA</name>
+
+ <properties>
+ <bnd.importpackage>!com.ibm.*,!com.sun.*,!oracle.*,!org.apache.bval.*,!org.apache.geronimo.*,!org.apache.avalon.*,!org.apache.log,!org.apache.tools.*,!org.apache.xerces.*,!org.jboss.*,!org.postgresql.*,!org.slf4j.impl,!weblogic.*,!javax.rmi</bnd.importpackage>
+ </properties>
+
+ <dependencies>
+ <!-- https://mvnrepository.com/artifact/org.apache.openjpa/openjpa-all -->
+ <dependency>
+ <groupId>org.apache.openjpa</groupId>
+ <artifactId>openjpa-all</artifactId>
+ <version>2.4.0</version>
+ </dependency>
+ <!-- https://mvnrepository.com/artifact/org.apache.derby/derby -->
+ <dependency>
+ <groupId>org.apache.derby</groupId>
+ <artifactId>derby</artifactId>
+ <version>10.11.1.1</version>
+ <scope>test</scope>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>org.apache.openjpa</groupId>
+ <artifactId>openjpa-maven-plugin</artifactId>
+ <version>3.1.0</version>
+ <configuration>
+ <excludes>org/apache/bval/**</excludes>
+ <includes>**/model/*.class</includes>
+ <addDefaultConstructor>true</addDefaultConstructor>
+ <enforcePropertyRestrictions>true</enforcePropertyRestrictions>
+ </configuration>
+ <dependencies>
+ <dependency>
+ <groupId>org.apache.openjpa</groupId>
+ <artifactId>openjpa</artifactId>
+ <!-- set the version to be the same as the level in your runtime -->
+ <version>3.1.0</version>
+ </dependency>
+ </dependencies>
+ <executions>
+ <execution>
+ <id>enhancer</id>
+ <goals>
+ <goal>enhance</goal>
+ </goals>
+ <phase>process-classes</phase>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </build>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.persistence.jpa-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+ <repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+ <feature name="openhab-persistence-jpa" description="JPA Persistence" version="${project.version}">
+ <feature>openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.jpa/${project.version}</bundle>
+ <configfile finalname="${openhab.conf}/services/jpa.cfg" override="false">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/jpa</configfile>
+ </feature>
+
+</features>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jpa.internal;
+
+import java.util.Map;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * The configuration required for the JPA persistence service.
+ *
+ * @author Manfred Bergmann - Initial contribution
+ * @author Kai Kreuzer - migrated to 3.x
+ *
+ */
+public class JpaConfiguration {
+ private final Logger logger = LoggerFactory.getLogger(JpaConfiguration.class);
+
+ private static final String CFG_CONNECTION_URL = "url";
+ private static final String CFG_DRIVER_CLASS = "driver";
+ private static final String CFG_USERNAME = "user";
+ private static final String CFG_PASSWORD = "password";
+ private static final String CFG_SYNCMAPPING = "syncmappings";
+
+ public static boolean isInitialized = false;
+
+ public final String dbConnectionUrl;
+ public final String dbDriverClass;
+ public final String dbUserName;
+ public final String dbPassword;
+ public final String dbSyncMapping;
+
+ public JpaConfiguration(final Map<String, Object> properties) {
+ logger.debug("Update config...");
+
+ String param = (String) properties.get(CFG_CONNECTION_URL);
+ logger.debug("url: {}", param);
+ if (param == null) {
+ logger.warn("Connection url is required in jpa.cfg!");
+ } else if (param.isBlank()) {
+ logger.warn("Empty connection url in jpa.cfg!");
+ }
+ dbConnectionUrl = param;
+
+ param = (String) properties.get(CFG_DRIVER_CLASS);
+ logger.debug("driver: {}", param);
+ if (param == null) {
+ logger.warn("Driver class is required in jpa.cfg!");
+ } else if (param.isBlank()) {
+ logger.warn("Empty driver class in jpa.cfg!");
+ }
+ dbDriverClass = param;
+
+ if (properties.get(CFG_USERNAME) == null) {
+ logger.info("{} was not specified!", CFG_USERNAME);
+ }
+ dbUserName = (String) properties.get(CFG_USERNAME);
+
+ if (properties.get(CFG_PASSWORD) == null) {
+ logger.info("{} was not specified!", CFG_PASSWORD);
+ }
+ dbPassword = (String) properties.get(CFG_PASSWORD);
+
+ if (properties.get(CFG_SYNCMAPPING) == null) {
+ logger.debug("{} was not specified!", CFG_SYNCMAPPING);
+ }
+ dbSyncMapping = (String) properties.get(CFG_SYNCMAPPING);
+
+ isInitialized = true;
+ logger.debug("Update config... done");
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jpa.internal;
+
+import java.text.DateFormat;
+import java.time.Instant;
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.util.ArrayList;
+import java.util.Date;
+import java.util.List;
+
+import org.openhab.core.items.Item;
+import org.openhab.core.library.items.CallItem;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.LocationItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.library.types.StringListType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.persistence.jpa.internal.model.JpaPersistentItem;
+
+/**
+ * The historic item as returned when querying the service.
+ *
+ * @author Manfred Bergmann - Initial contribution
+ *
+ */
+public class JpaHistoricItem implements HistoricItem {
+
+ private final String name;
+ private final State state;
+ private final ZonedDateTime timestamp;
+
+ public JpaHistoricItem(String name, State state, ZonedDateTime timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public ZonedDateTime getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public String toString() {
+ // DateFormat cannot format a ZonedDateTime directly, so convert to a java.util.Date first
+ return DateFormat.getDateTimeInstance().format(Date.from(timestamp.toInstant())) + ": " + name + " -> " + state.toString();
+ }
+
+ /**
+ * This method maps a jpa result item to this historic item.
+ *
+ * @param jpaQueryResult the query result containing the persisted JPA items
+ * @param item the reference item, used to determine the state type
+ * @return list of historic items
+ */
+ public static List<HistoricItem> fromResultList(List<JpaPersistentItem> jpaQueryResult, Item item) {
+ List<HistoricItem> ret = new ArrayList<>();
+ for (JpaPersistentItem i : jpaQueryResult) {
+ HistoricItem hi = fromPersistedItem(i, item);
+ ret.add(hi);
+ }
+ return ret;
+ }
+
+ /**
+ * Converts the string value of the persisted item to the state of a HistoricItem.
+ *
+ * @param pItem the persisted JpaPersistentItem
+ * @param item the source reference Item
+ * @return historic item
+ */
+ public static HistoricItem fromPersistedItem(JpaPersistentItem pItem, Item item) {
+ State state;
+ if (item instanceof NumberItem) {
+ state = new DecimalType(Double.valueOf(pItem.getValue()));
+ } else if (item instanceof DimmerItem) {
+ state = new PercentType(Integer.valueOf(pItem.getValue()));
+ } else if (item instanceof SwitchItem) {
+ state = OnOffType.valueOf(pItem.getValue());
+ } else if (item instanceof ContactItem) {
+ state = OpenClosedType.valueOf(pItem.getValue());
+ } else if (item instanceof RollershutterItem) {
+ state = PercentType.valueOf(pItem.getValue());
+ } else if (item instanceof DateTimeItem) {
+ state = new DateTimeType(ZonedDateTime.ofInstant(Instant.ofEpochMilli(Long.valueOf(pItem.getValue())),
+ ZoneId.systemDefault()));
+ } else if (item instanceof LocationItem) {
+ PointType pType = null;
+ String[] comps = pItem.getValue().split(";");
+ if (comps.length >= 2) {
+ pType = new PointType(new DecimalType(comps[0]), new DecimalType(comps[1]));
+
+ if (comps.length == 3) {
+ pType.setAltitude(new DecimalType(comps[2]));
+ }
+ }
+ state = pType;
+ } else if (item instanceof CallItem) {
+ // CallItem states are persisted as StringListType values
+ state = new StringListType(pItem.getValue());
+ } else {
+ state = new StringType(pItem.getValue());
+ }
+
+ return new JpaHistoricItem(item.getName(), state, pItem.getTimestamp());
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jpa.internal;
+
+import java.util.Collections;
+import java.util.Date;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+
+import javax.persistence.EntityManager;
+import javax.persistence.EntityManagerFactory;
+import javax.persistence.Persistence;
+import javax.persistence.Query;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.UnDefType;
+import org.openhab.persistence.jpa.internal.model.JpaPersistentItem;
+import org.osgi.framework.BundleContext;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.ConfigurationPolicy;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * JPA based implementation of QueryablePersistenceService.
+ *
+ * @author Manfred Bergmann - Initial contribution
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.jpa", configurationPolicy = ConfigurationPolicy.REQUIRE)
+public class JpaPersistenceService implements QueryablePersistenceService {
+ private final Logger logger = LoggerFactory.getLogger(JpaPersistenceService.class);
+
+ private final ItemRegistry itemRegistry;
+
+ private @Nullable EntityManagerFactory emf = null;
+
+ private @NonNullByDefault({}) JpaConfiguration config;
+
+ @Activate
+ public JpaPersistenceService(final @Reference ItemRegistry itemRegistry) {
+ this.itemRegistry = itemRegistry;
+ }
+
+ /**
+ * Lazily creates the EntityManagerFactory, because the configuration only becomes available after activation.
+ *
+ * @return EntityManagerFactory
+ */
+ protected @Nullable EntityManagerFactory getEntityManagerFactory() {
+ if (emf == null) {
+ emf = newEntityManagerFactory();
+ }
+ return emf;
+ }
+
+ @Activate
+ public void activate(BundleContext context, Map<String, Object> properties) {
+ logger.debug("Activating jpa persistence service");
+ config = new JpaConfiguration(properties);
+ }
+
+ /**
+ * Closes the EntityManagerFactory
+ */
+ @Deactivate
+ public void deactivate() {
+ logger.debug("Deactivating jpa persistence service");
+ closeEntityManagerFactory();
+ }
+
+ @Override
+ public String getId() {
+ return "jpa";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "JPA";
+ }
+
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ logger.debug("Storing item: {}", item.getName());
+
+ if (item.getState() instanceof UnDefType) {
+ logger.debug("This item is of undefined type. Cannot persist it!");
+ return;
+ }
+
+ if (!JpaConfiguration.isInitialized) {
+ logger.debug("Trying to create EntityManagerFactory but we don't have configuration yet!");
+ return;
+ }
+
+ // determine item name to be stored
+ String name = (alias != null) ? alias : item.getName();
+
+ JpaPersistentItem pItem = new JpaPersistentItem();
+ try {
+ String newValue = StateHelper.toString(item.getState());
+ pItem.setValue(newValue);
+ logger.debug("Stored new value: {}", newValue);
+ } catch (Exception e1) {
+ logger.error("Error on converting state value to string: {}", e1.getMessage());
+ return;
+ }
+ pItem.setName(name);
+ pItem.setRealName(item.getName());
+ pItem.setTimestamp(new Date());
+
+ EntityManager em = getEntityManagerFactory().createEntityManager();
+ try {
+ logger.debug("Persisting item...");
+ // In RESOURCE_LOCAL calls to EntityManager require a begin/commit
+ em.getTransaction().begin();
+ em.persist(pItem);
+ em.getTransaction().commit();
+ logger.debug("Persisting item...done");
+ } catch (Exception e) {
+ logger.error("Error on persisting item! Rolling back!", e);
+ em.getTransaction().rollback();
+ } finally {
+ em.close();
+ }
+
+ logger.debug("Storing item...done");
+ }
+
+ @Override
+ public Set<PersistenceItemInfo> getItemInfo() {
+ return Collections.emptySet();
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ logger.debug("Querying for historic item: {}", filter.getItemName());
+
+ if (!JpaConfiguration.isInitialized) {
+ logger.warn("Trying to create EntityManagerFactory but we don't have configuration yet!");
+ return Collections.emptyList();
+ }
+
+ String itemName = filter.getItemName();
+ Item item = getItemFromRegistry(itemName);
+ if (item == null) {
+ logger.warn("Item '{}' was not found in the item registry, cannot query it", itemName);
+ return Collections.emptyList();
+ }
+
+ String sortOrder;
+ if (filter.getOrdering() == Ordering.ASCENDING) {
+ sortOrder = "ASC";
+ } else {
+ sortOrder = "DESC";
+ }
+
+ boolean hasBeginDate = false;
+ boolean hasEndDate = false;
+ String queryString = "SELECT n FROM " + JpaPersistentItem.class.getSimpleName()
+ + " n WHERE n.realName = :itemName";
+ if (filter.getBeginDate() != null) {
+ queryString += " AND n.timestamp >= :beginDate";
+ hasBeginDate = true;
+ }
+ if (filter.getEndDate() != null) {
+ queryString += " AND n.timestamp <= :endDate";
+ hasEndDate = true;
+ }
+ queryString += " ORDER BY n.timestamp " + sortOrder;
+
+ logger.debug("The query: {}", queryString);
+
+ EntityManager em = getEntityManagerFactory().createEntityManager();
+ try {
+ // In RESOURCE_LOCAL calls to EntityManager require a begin/commit
+ em.getTransaction().begin();
+
+ logger.debug("Creating query...");
+ Query query = em.createQuery(queryString);
+ query.setParameter("itemName", item.getName());
+ if (hasBeginDate) {
+ query.setParameter("beginDate", Date.from(filter.getBeginDate().toInstant()));
+ }
+ if (hasEndDate) {
+ query.setParameter("endDate", Date.from(filter.getEndDate().toInstant()));
+ }
+
+ query.setFirstResult(filter.getPageNumber() * filter.getPageSize());
+ query.setMaxResults(filter.getPageSize());
+ logger.debug("Creating query...done");
+
+ logger.debug("Retrieving result list...");
+ @SuppressWarnings("unchecked")
+ List<JpaPersistentItem> result = query.getResultList();
+ logger.debug("Retrieving result list...done");
+
+ List<HistoricItem> historicList = JpaHistoricItem.fromResultList(result, item);
+ logger.debug("{}", String.format("Convert to HistoricItem: %d", historicList.size()));
+
+ em.getTransaction().commit();
+
+ return historicList;
+ } catch (Exception e) {
+ logger.error("Error on querying database!", e);
+ em.getTransaction().rollback();
+ } finally {
+ em.close();
+ }
+
+ return Collections.emptyList();
+ }
+
+ /**
+ * Creates a new EntityManagerFactory with properties read from jpa.cfg via JpaConfiguration.
+ *
+ * @return initialized EntityManagerFactory
+ */
+ protected EntityManagerFactory newEntityManagerFactory() {
+ logger.trace("Creating EntityManagerFactory...");
+
+ Map<String, String> properties = new HashMap<>();
+ properties.put("javax.persistence.jdbc.url", config.dbConnectionUrl);
+ properties.put("javax.persistence.jdbc.driver", config.dbDriverClass);
+ if (config.dbUserName != null) {
+ properties.put("javax.persistence.jdbc.user", config.dbUserName);
+ }
+ if (config.dbPassword != null) {
+ properties.put("javax.persistence.jdbc.password", config.dbPassword);
+ }
+ if (config.dbUserName != null && config.dbPassword == null) {
+ logger.warn("JPA persistence - it is recommended to use a password to protect data store");
+ }
+ if (config.dbSyncMapping != null && !config.dbSyncMapping.isBlank()) {
+ logger.warn("You are settings openjpa.jdbc.SynchronizeMappings, I hope you know what you're doing!");
+ properties.put("openjpa.jdbc.SynchronizeMappings", config.dbSyncMapping);
+ }
+
+ EntityManagerFactory fac = Persistence.createEntityManagerFactory(getPersistenceUnitName(), properties);
+ logger.debug("Creating EntityManagerFactory...done");
+
+ return fac;
+ }
+
+ /**
+ * Closes EntityManagerFactory
+ */
+ protected void closeEntityManagerFactory() {
+ if (emf != null) {
+ emf.close();
+ emf = null;
+ }
+ logger.debug("Closing down entity objects...done");
+ }
+
+ /**
+ * Checks if EntityManagerFactory is open
+ *
+ * @return true when open, false otherwise
+ */
+ protected boolean isEntityManagerFactoryOpen() {
+ return emf != null && emf.isOpen();
+ }
+
+ /**
+ * Return the persistence unit as in persistence.xml file.
+ *
+ * @return the persistence unit name
+ */
+ protected String getPersistenceUnitName() {
+ return "default";
+ }
+
+ /**
+ * Retrieves the item for the given name from the item registry
+ *
+ * @param itemName the name of the item to look up
+ * @return item
+ */
+ private @Nullable Item getItemFromRegistry(String itemName) {
+ try {
+ return itemRegistry.getItem(itemName);
+ } catch (ItemNotFoundException e1) {
+ logger.error("Unable to get item type for {}", itemName);
+ }
+ return null;
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return Collections.emptyList();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jpa.internal;
+
+import java.util.Locale;
+
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.PointType;
+import org.openhab.core.types.State;
+
+/**
+ * Helper class for dealing with State
+ *
+ * @author Manfred Bergmann - Initial contribution
+ *
+ */
+public class StateHelper {
+
+ /**
+ * Converts the given State to a string that can be persisted in the database
+ *
+ * @param state the state of the item to be persisted
+ * @return state converted as string
+ * @throws Exception
+ */
+ public static String toString(State state) throws Exception {
+ if (state instanceof DateTimeType) {
+ return String.valueOf(((DateTimeType) state).getZonedDateTime().toInstant().toEpochMilli());
+ }
+ if (state instanceof DecimalType) {
+ return String.valueOf(((DecimalType) state).doubleValue());
+ }
+ if (state instanceof PointType) {
+ PointType pType = (PointType) state;
+ return String.format(Locale.ENGLISH, "%f;%f;%f", pType.getLatitude().doubleValue(),
+ pType.getLongitude().doubleValue(), pType.getAltitude().doubleValue());
+ }
+
+ return state.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.jpa.internal.model;
+
+import java.text.DateFormat;
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.util.Date;
+
+import javax.persistence.Column;
+import javax.persistence.Entity;
+import javax.persistence.GeneratedValue;
+import javax.persistence.GenerationType;
+import javax.persistence.Id;
+import javax.persistence.Table;
+import javax.persistence.Temporal;
+import javax.persistence.TemporalType;
+
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+
+/**
+ * This is the entity object used for storing item states in and retrieving them from the database.
+ *
+ * @author Manfred Bergmann - Initial contribution
+ *
+ */
+@Entity
+@Table(name = "HISTORIC_ITEM")
+public class JpaPersistentItem implements HistoricItem {
+
+ @Id
+ @GeneratedValue(strategy = GenerationType.AUTO)
+ private Long id;
+
+ private String name = "";
+ private String realName = "";
+ @Temporal(TemporalType.TIMESTAMP)
+ private Date timestamp = new Date();
+ @Column(length = 32672) // 32k, max varchar for apache derby
+ private String value = "";
+
+ public Long getId() {
+ return id;
+ }
+
+ public void setId(Long id) {
+ this.id = id;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ public String getRealName() {
+ return realName;
+ }
+
+ public void setRealName(String realName) {
+ this.realName = realName;
+ }
+
+ @Override
+ public ZonedDateTime getTimestamp() {
+ return ZonedDateTime.ofInstant(timestamp.toInstant(), ZoneId.systemDefault());
+ }
+
+ public void setTimestamp(Date timestamp) {
+ this.timestamp = timestamp;
+ }
+
+ public String getValue() {
+ return value;
+ }
+
+ public void setValue(String value) {
+ this.value = value;
+ }
+
+ @Override
+ public State getState() {
+ return UnDefType.NULL;
+ }
+
+ @Override
+ public String toString() {
+ // format the raw Date field; DateFormat cannot format the ZonedDateTime returned by getTimestamp()
+ return DateFormat.getDateTimeInstance().format(timestamp) + ": " + getName() + " -> " + value;
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="1.0">
+
+ <persistence-unit name="default" transaction-type="RESOURCE_LOCAL">
+ <provider>org.apache.openjpa.persistence.PersistenceProviderImpl</provider>
+ <non-jta-data-source/>
+ <class>org.openhab.persistence.jpa.internal.model.JpaPersistentItem</class>
+ <exclude-unlisted-classes>true</exclude-unlisted-classes>
+ <properties>
+ <property name="javax.persistence.jdbc.url" value="jdbc:postgresql:"/>
+ <property name="javax.persistence.jdbc.driver" value="org.postgresql.Driver"/>
+ <property name="javax.persistence.jdbc.user" value=""/>
+ <property name="javax.persistence.jdbc.password" value=""/>
+ <property name="openjpa.jdbc.SynchronizeMappings" value="buildSchema(schemaAction='add')"/>
+ <property name="openjpa.Log" value="DefaultLevel=WARN, Tool=INFO"/>
+ </properties>
+ </persistence-unit>
+
+ <persistence-unit name="default_test" transaction-type="RESOURCE_LOCAL">
+ <provider>org.apache.openjpa.persistence.PersistenceProviderImpl</provider>
+ <non-jta-data-source/>
+ <class>org.openhab.persistence.jpa.internal.model.JpaPersistentItem</class>
+ <exclude-unlisted-classes>true</exclude-unlisted-classes>
+ <properties>
+ <property name="javax.persistence.jdbc.url" value="jdbc:derby:data;create=true"/>
+ <property name="javax.persistence.jdbc.driver" value="org.apache.derby.jdbc.EmbeddedDriver"/>
+ <property name="javax.persistence.jdbc.user" value="APP"/>
+ <property name="javax.persistence.jdbc.password" value="APP"/>
+ <property name="openjpa.jdbc.SynchronizeMappings" value="buildSchema(SchemaAction='drop,add')"/>
+ <property name="openjpa.Log" value="DefaultLevel=TRACE, Tool=INFO"/>
+ </properties>
+ </persistence-unit>
+
+</persistence>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="test" value="true"/>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.persistence.mapdb</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.persistence.mapdb</artifactId>
+
+ <name>openHAB Add-ons :: Bundles :: Persistence Service :: MapDB</name>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.mapdb</groupId>
+ <artifactId>mapdb</artifactId>
+ <version>1.0.9</version>
+ </dependency>
+ </dependencies>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.persistence.mapdb-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+ <repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+ <feature name="openhab-persistence-mapdb" description="MapDB Persistence" version="${project.version}">
+ <feature>openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.mapdb/${project.version}</bundle>
+ </feature>
+
+</features>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mapdb.internal;
+
+import java.text.DateFormat;
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.util.Date;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+
+/**
+ * This is a Java bean used to persist item states with timestamps in the database.
+ *
+ * @author Jens Viebig - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class MapDbItem implements HistoricItem, PersistenceItemInfo {
+
+ private String name = "";
+ private State state = UnDefType.NULL;
+ private Date timestamp = new Date(0);
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ public void setState(State state) {
+ this.state = state;
+ }
+
+ @Override
+ public ZonedDateTime getTimestamp() {
+ return ZonedDateTime.ofInstant(timestamp.toInstant(), ZoneId.systemDefault());
+ }
+
+ public void setTimestamp(Date timestamp) {
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String toString() {
+ return DateFormat.getDateTimeInstance().format(timestamp) + ": " + name + " -> " + state.toString();
+ }
+
+ @Override
+ public @Nullable Integer getCount() {
+ return Integer.valueOf(1);
+ }
+
+ @Override
+ public @Nullable Date getEarliest() {
+ return null;
+ }
+
+ @Override
+ public @Nullable Date getLatest() {
+ return null;
+ }
+
+ public boolean isValid() {
+ return name != null && state != null && timestamp != null;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mapdb.internal;
+
+import java.io.File;
+import java.util.Collections;
+import java.util.Date;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Optional;
+import java.util.Set;
+import java.util.concurrent.ExecutorService;
+import java.util.stream.Collectors;
+import java.util.stream.Stream;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.mapdb.DB;
+import org.mapdb.DBMaker;
+import org.openhab.core.OpenHAB;
+import org.openhab.core.common.ThreadPoolManager;
+import org.openhab.core.items.Item;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.Deactivate;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+
+/**
+ * This is the implementation of the MapDB {@link PersistenceService}. To learn more about MapDB please visit their
+ * <a href="http://www.mapdb.org/">website</a>.
+ *
+ * @author Jens Viebig - Initial contribution
+ * @author Martin Kühl - Port to 3.x
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class, QueryablePersistenceService.class })
+public class MapDbPersistenceService implements QueryablePersistenceService {
+
+ private static final String SERVICE_ID = "mapdb";
+ private static final String SERVICE_LABEL = "MapDB";
+ private static final String DB_FOLDER_NAME = OpenHAB.getUserDataFolder() + File.separator + "persistence"
+ + File.separator + "mapdb";
+ private static final String DB_FILE_NAME = "storage.mapdb";
+
+ private final Logger logger = LoggerFactory.getLogger(MapDbPersistenceService.class);
+
+ private final ExecutorService threadPool = ThreadPoolManager.getPool(getClass().getSimpleName());
+
+ /** holds the local instance of the MapDB database */
+ private @NonNullByDefault({}) DB db;
+ private @NonNullByDefault({}) Map<String, String> map;
+
+ private transient Gson mapper = new GsonBuilder().registerTypeHierarchyAdapter(State.class, new StateTypeAdapter())
+ .create();
+
+ @Activate
+ public void activate() {
+ logger.debug("MapDB persistence service is being activated");
+
+ File folder = new File(DB_FOLDER_NAME);
+ if (!folder.exists()) {
+ if (!folder.mkdirs()) {
+ logger.warn("Failed to create one or more directories in the path '{}'", DB_FOLDER_NAME);
+ logger.warn("MapDB persistence service activation has failed.");
+ return;
+ }
+ }
+
+ File dbFile = new File(DB_FOLDER_NAME, DB_FILE_NAME);
+ db = DBMaker.newFileDB(dbFile).closeOnJvmShutdown().make();
+ map = db.createTreeMap("itemStore").makeOrGet();
+ logger.debug("MapDB persistence service is now activated");
+ }
+
+ @Deactivate
+ public void deactivate() {
+ logger.debug("MapDB persistence service deactivated");
+ if (db != null) {
+ db.close();
+ }
+ threadPool.shutdown();
+ }
+
+ @Override
+ public String getId() {
+ return SERVICE_ID;
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return SERVICE_LABEL;
+ }
+
+ @Override
+ public Set<PersistenceItemInfo> getItemInfo() {
+ return map.values().stream().map(this::deserialize).flatMap(MapDbPersistenceService::streamOptional)
+ .collect(Collectors.<PersistenceItemInfo> toUnmodifiableSet());
+ }
+
+ @Override
+ public void store(Item item) {
+ store(item, item.getName());
+ }
+
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ if (item.getState() instanceof UnDefType) {
+ return;
+ }
+
+ // PersistenceManager passes SimpleItemConfiguration.alias which can be null
+ String localAlias = alias == null ? item.getName() : alias;
+ logger.debug("store called for {}", localAlias);
+
+ State state = item.getState();
+ MapDbItem mItem = new MapDbItem();
+ mItem.setName(localAlias);
+ mItem.setState(state);
+ mItem.setTimestamp(new Date());
+ String json = serialize(mItem);
+ map.put(localAlias, json);
+ commit();
+ logger.debug("Stored '{}' with state '{}' in MapDB database", localAlias, state.toString());
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ String json = map.get(filter.getItemName());
+ if (json == null) {
+ return Collections.emptyList();
+ }
+ Optional<MapDbItem> item = deserialize(json);
+ if (!item.isPresent()) {
+ return Collections.emptyList();
+ }
+ return Collections.singletonList(item.get());
+ }
+
+ private String serialize(MapDbItem item) {
+ return mapper.toJson(item);
+ }
+
+ @SuppressWarnings("null")
+ private Optional<MapDbItem> deserialize(String json) {
+ MapDbItem item = mapper.<MapDbItem> fromJson(json, MapDbItem.class);
+ if (item == null || !item.isValid()) {
+ logger.warn("Deserialized invalid item: {}", item);
+ return Optional.empty();
+ }
+ return Optional.of(item);
+ }
+
+ private void commit() {
+ threadPool.submit(() -> db.commit());
+ }
+
+ private static <T> Stream<T> streamOptional(Optional<T> opt) {
+ if (!opt.isPresent()) {
+ return Stream.empty();
+ }
+ return Stream.of(opt.get());
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return List.of(PersistenceStrategy.Globals.RESTORE, PersistenceStrategy.Globals.CHANGE);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mapdb.internal;
+
+import java.io.IOException;
+import java.util.Collections;
+import java.util.List;
+
+import org.openhab.core.types.State;
+import org.openhab.core.types.TypeParser;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.google.gson.TypeAdapter;
+import com.google.gson.stream.JsonReader;
+import com.google.gson.stream.JsonToken;
+import com.google.gson.stream.JsonWriter;
+
+/**
+ * A GSON TypeAdapter for openHAB State values.
+ *
+ * @author Martin Kühl - Initial contribution
+ */
+public class StateTypeAdapter extends TypeAdapter<State> {
+ private static final String TYPE_SEPARATOR = "@@@";
+
+ private final Logger logger = LoggerFactory.getLogger(StateTypeAdapter.class);
+
+ @Override
+ public State read(JsonReader reader) throws IOException {
+ if (reader.peek() == JsonToken.NULL) {
+ reader.nextNull();
+ return null;
+ }
+ String value = reader.nextString();
+ String[] parts = value.split(TYPE_SEPARATOR);
+ if (parts.length < 2) {
+ logger.warn("Couldn't deserialize state '{}': value is missing the type separator", value);
+ return null;
+ }
+ String valueTypeName = parts[0];
+ String valueAsString = parts[1];
+
+ try {
+ @SuppressWarnings("unchecked")
+ Class<? extends State> valueType = (Class<? extends State>) Class.forName(valueTypeName);
+ List<Class<? extends State>> types = Collections.singletonList(valueType);
+ return TypeParser.parseState(types, valueAsString);
+ } catch (Exception e) {
+ logger.warn("Couldn't deserialize state '{}': {}", value, e.getMessage());
+ }
+ return null;
+ }
+
+ @Override
+ public void write(JsonWriter writer, State state) throws IOException {
+ if (state == null) {
+ writer.nullValue();
+ return;
+ }
+ String value = state.getClass().getName() + TYPE_SEPARATOR + state.toFullString();
+ writer.value(value);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mapdb;
+
+import static org.hamcrest.CoreMatchers.*;
+import static org.hamcrest.MatcherAssert.assertThat;
+
+import org.junit.jupiter.api.Test;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.types.State;
+import org.openhab.persistence.mapdb.internal.StateTypeAdapter;
+
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+
+/**
+ *
+ * @author Martin Kühl - Initial contribution
+ */
+public class StateTypeAdapterTest {
+ Gson mapper = new GsonBuilder().registerTypeHierarchyAdapter(State.class, new StateTypeAdapter()).create();
+
+ @Test
+ public void readWriteRoundtripShouldRecreateTheWrittenState() {
+ assertThat(roundtrip(OnOffType.ON), is(equalTo(OnOffType.ON)));
+ assertThat(roundtrip(PercentType.HUNDRED), is(equalTo(PercentType.HUNDRED)));
+ assertThat(roundtrip(HSBType.GREEN), is(equalTo(HSBType.GREEN)));
+ assertThat(roundtrip(StringType.valueOf("test")), is(equalTo(StringType.valueOf("test"))));
+ }
+
+ private State roundtrip(State state) {
+ return mapper.fromJson(mapper.toJson(state), State.class);
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="test" value="true"/>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.persistence.mongodb</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+# MongoDB Persistence
+
+This service allows you to persist state updates in a MongoDB database.
+It supports writing information to a MongoDB document store, as well as querying it.
+
+## Configuration
+
+This service can be configured in the file `services/mongodb.cfg`.
+
+| Property | Default | Required | Description |
+| ---------- | ------- | :------: | ---------------------------------------------------------------------------- |
+| url | | Yes | connection URL to address MongoDB. For example, `mongodb://localhost:27017` |
+| database | | Yes | database name |
+| collection | | Yes | collection name |
+
+All item and event related configuration is done in the file `persistence/mongodb.persist`.
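+
+For example, a minimal `services/mongodb.cfg` might look as follows (the URL, database, and collection values here are placeholders to adapt to your own installation):
+
+```
+url=mongodb://localhost:27017
+database=openhab
+collection=states
+```
+
+A matching `persistence/mongodb.persist` file, using the standard persistence syntax, could then be:
+
+```java
+Items {
+ // persist every Item on each state change
+ * : strategy = everyChange
+}
+```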
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.persistence.mongodb</artifactId>
+
+ <name>openHAB Add-ons :: Bundles :: Persistence Service :: MongoDB</name>
+
+ <dependencies>
+ <!-- https://mvnrepository.com/artifact/org.mongodb/mongo-java-driver -->
+ <dependency>
+ <groupId>org.mongodb</groupId>
+ <artifactId>mongo-java-driver</artifactId>
+ <version>2.13.1</version>
+ </dependency>
+ </dependencies>
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.persistence.mongodb-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+ <repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+ <feature name="openhab-persistence-mongodb" description="MongoDB Persistence" version="${project.version}">
+ <feature>openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.mongodb/${project.version}</bundle>
+ </feature>
+
+</features>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mongodb.internal;
+
+import java.text.DateFormat;
+import java.time.ZonedDateTime;
+import java.util.Date;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * This is the implementation of the MongoDB historic item.
+ *
+ * @author Thorsten Hoeger - Initial contribution
+ */
+@NonNullByDefault
+public class MongoDBItem implements HistoricItem {
+
+ private final String name;
+ private final State state;
+ private final ZonedDateTime timestamp;
+
+ public MongoDBItem(String name, State state, ZonedDateTime timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public ZonedDateTime getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public String toString() {
+ // DateFormat cannot format a ZonedDateTime directly, so convert to java.util.Date first
+ return DateFormat.getDateTimeInstance().format(Date.from(timestamp.toInstant())) + ": " + name + " -> " + state.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mongodb.internal;
+
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.Date;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+
+import org.bson.types.ObjectId;
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DateTimeItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Operator;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.osgi.framework.BundleContext;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.ConfigurationPolicy;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.mongodb.BasicDBObject;
+import com.mongodb.DBCollection;
+import com.mongodb.DBCursor;
+import com.mongodb.DBObject;
+import com.mongodb.MongoClient;
+import com.mongodb.MongoClientURI;
+
+/**
+ * This is the implementation of the MongoDB {@link PersistenceService}.
+ *
+ * @author Thorsten Hoeger - Initial contribution
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.mongodb", configurationPolicy = ConfigurationPolicy.REQUIRE)
+public class MongoDBPersistenceService implements QueryablePersistenceService {
+
+ private static final String FIELD_ID = "_id";
+ private static final String FIELD_ITEM = "item";
+ private static final String FIELD_REALNAME = "realName";
+ private static final String FIELD_TIMESTAMP = "timestamp";
+ private static final String FIELD_VALUE = "value";
+
+ private final Logger logger = LoggerFactory.getLogger(MongoDBPersistenceService.class);
+
+ private @NonNullByDefault({}) String url;
+ private @NonNullByDefault({}) String db;
+ private @NonNullByDefault({}) String collection;
+
+ private boolean initialized = false;
+
+ protected final ItemRegistry itemRegistry;
+
+ private @NonNullByDefault({}) MongoClient cl;
+ private @NonNullByDefault({}) DBCollection mongoCollection;
+
+ @Activate
+ public MongoDBPersistenceService(final @Reference ItemRegistry itemRegistry) {
+ this.itemRegistry = itemRegistry;
+ }
+
+ @Activate
+ public void activate(final BundleContext bundleContext, final Map<String, Object> config) {
+ url = (String) config.get("url");
+ logger.debug("MongoDB URL {}", url);
+ if (url == null || url.isBlank()) {
+ logger.warn("The MongoDB database URL is missing - please configure the mongodb:url parameter.");
+ return;
+ }
+ db = (String) config.get("database");
+ logger.debug("MongoDB database {}", db);
+ if (db == null || db.isBlank()) {
+ logger.warn("The MongoDB database name is missing - please configure the mongodb:database parameter.");
+ return;
+ }
+ collection = (String) config.get("collection");
+ logger.debug("MongoDB collection {}", collection);
+ if (collection == null || collection.isBlank()) {
+ logger.warn(
+ "The MongoDB database collection is missing - please configure the mongodb:collection parameter.");
+ return;
+ }
+
+ disconnectFromDatabase();
+ connectToDatabase();
+
+ // connection has been established... initialization completed!
+ initialized = true;
+ }
+
+ @Deactivate
+ public void deactivate(final int reason) {
+ logger.debug("MongoDB persistence bundle stopping. Disconnecting from database.");
+ disconnectFromDatabase();
+ }
+
+ @Override
+ public String getId() {
+ return "mongodb";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "Mongo DB";
+ }
+
+ @Override
+ public void store(Item item, @Nullable String alias) {
+ // Don't log undefined/uninitialized data
+ if (item.getState() instanceof UnDefType) {
+ return;
+ }
+
+ // If we've not initialized the bundle, then return
+ if (!initialized) {
+ logger.warn("MongoDB not initialized");
+ return;
+ }
+
+ // Connect to mongodb server if we're not already connected
+ if (!isConnected()) {
+ connectToDatabase();
+ }
+
+ // If we still didn't manage to connect, then return!
+ if (!isConnected()) {
+ logger.warn(
+ "mongodb: No connection to database. Cannot persist item '{}'! Will retry connecting to database next time.",
+ item);
+ return;
+ }
+
+ String realName = item.getName();
+ String name = (alias != null) ? alias : realName;
+ Object value = this.convertValue(item.getState());
+
+ DBObject obj = new BasicDBObject();
+ obj.put(FIELD_ID, new ObjectId());
+ obj.put(FIELD_ITEM, name);
+ obj.put(FIELD_REALNAME, realName);
+ obj.put(FIELD_TIMESTAMP, new Date());
+ obj.put(FIELD_VALUE, value);
+ this.mongoCollection.save(obj);
+
+ logger.debug("MongoDB save {}={}", name, value);
+ }
+
+ private Object convertValue(State state) {
+ Object value;
+ if (state instanceof PercentType) {
+ value = ((PercentType) state).toBigDecimal().doubleValue();
+ } else if (state instanceof DateTimeType) {
+ value = Date.from(((DateTimeType) state).getZonedDateTime().toInstant());
+ } else if (state instanceof DecimalType) {
+ value = ((DecimalType) state).toBigDecimal().doubleValue();
+ } else {
+ value = state.toString();
+ }
+ return value;
+ }
+
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ @Override
+ public Set<PersistenceItemInfo> getItemInfo() {
+ return Collections.emptySet();
+ }
+
+ /**
+ * Checks if we have a database connection
+ *
+ * @return true if connection has been established, false otherwise
+ */
+ private boolean isConnected() {
+ return cl != null;
+ }
+
+ /**
+ * Connects to the database
+ */
+ private void connectToDatabase() {
+ try {
+ logger.debug("Connect MongoDB");
+ this.cl = new MongoClient(new MongoClientURI(this.url));
+ mongoCollection = cl.getDB(this.db).getCollection(this.collection);
+
+ BasicDBObject idx = new BasicDBObject();
+ idx.append(FIELD_TIMESTAMP, 1).append(FIELD_ITEM, 1);
+ this.mongoCollection.createIndex(idx);
+ logger.debug("Connect MongoDB ... done");
+ } catch (Exception e) {
+ logger.error("Failed to connect to database {}", this.url);
+ throw new RuntimeException("Cannot connect to database", e);
+ }
+ }
+
+ /**
+ * Disconnects from the database
+ */
+ private void disconnectFromDatabase() {
+ this.mongoCollection = null;
+ if (this.cl != null) {
+ this.cl.close();
+ }
+ cl = null;
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ if (!initialized) {
+ return Collections.emptyList();
+ }
+
+ if (!isConnected()) {
+ connectToDatabase();
+ }
+
+ if (!isConnected()) {
+ return Collections.emptyList();
+ }
+
+ String name = filter.getItemName();
+ Item item = getItem(name);
+
+ List<HistoricItem> items = new ArrayList<>();
+ DBObject query = new BasicDBObject();
+ if (filter.getItemName() != null) {
+ query.put(FIELD_ITEM, filter.getItemName());
+ }
+ if (filter.getState() != null && filter.getOperator() != null) {
+ String op = convertOperator(filter.getOperator());
+ Object value = convertValue(filter.getState());
+ query.put(FIELD_VALUE, new BasicDBObject(op, value));
+ }
+ if (filter.getBeginDate() != null || filter.getEndDate() != null) {
+ // build a single range query so the begin and end conditions do not overwrite each other
+ BasicDBObject timestampQuery = new BasicDBObject();
+ if (filter.getBeginDate() != null) {
+ timestampQuery.put("$gte", filter.getBeginDate());
+ }
+ if (filter.getEndDate() != null) {
+ timestampQuery.put("$lte", filter.getEndDate());
+ }
+ query.put(FIELD_TIMESTAMP, timestampQuery);
+ }
+
+ Integer sortDir = (filter.getOrdering() == Ordering.ASCENDING) ? 1 : -1;
+ DBCursor cursor = this.mongoCollection.find(query).sort(new BasicDBObject(FIELD_TIMESTAMP, sortDir))
+ .skip(filter.getPageNumber() * filter.getPageSize()).limit(filter.getPageSize());
+
+ while (cursor.hasNext()) {
+ BasicDBObject obj = (BasicDBObject) cursor.next();
+
+ final State state;
+ if (item instanceof NumberItem) {
+ state = new DecimalType(obj.getDouble(FIELD_VALUE));
+ } else if (item instanceof DimmerItem) {
+ state = new PercentType(obj.getInt(FIELD_VALUE));
+ } else if (item instanceof SwitchItem) {
+ state = OnOffType.valueOf(obj.getString(FIELD_VALUE));
+ } else if (item instanceof ContactItem) {
+ state = OpenClosedType.valueOf(obj.getString(FIELD_VALUE));
+ } else if (item instanceof RollershutterItem) {
+ state = new PercentType(obj.getInt(FIELD_VALUE));
+ } else if (item instanceof DateTimeItem) {
+ state = new DateTimeType(
+ ZonedDateTime.ofInstant(obj.getDate(FIELD_VALUE).toInstant(), ZoneId.systemDefault()));
+ } else {
+ state = new StringType(obj.getString(FIELD_VALUE));
+ }
+
+ items.add(new MongoDBItem(name, state,
+ ZonedDateTime.ofInstant(obj.getDate(FIELD_TIMESTAMP).toInstant(), ZoneId.systemDefault())));
+ }
+
+ return items;
+ }
+
+ private @Nullable String convertOperator(Operator operator) {
+ switch (operator) {
+ case EQ:
+ return "$eq";
+ case GT:
+ return "$gt";
+ case GTE:
+ return "$gte";
+ case LT:
+ return "$lt";
+ case LTE:
+ return "$lte";
+ case NEQ:
+ return "$neq";
+ default:
+ return null;
+ }
+ }
+
+ private @Nullable Item getItem(String itemName) {
+ try {
+ return itemRegistry.getItem(itemName);
+ } catch (ItemNotFoundException e1) {
+ logger.error("Unable to get item type for {}", itemName);
+ }
+ return null;
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return Collections.emptyList();
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="test" value="true"/>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.persistence.rrd4j</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+# rrd4j Persistence
+
+The [rrd4j](https://github.com/rrd4j/rrd4j) persistence service is based on a round-robin database.
+
+In contrast to a "normal" database such as db4o, a round-robin database does not grow in size - it has a fixed allocated size.
+This is accomplished by saving a fixed number of data points and by compressing the data, which means that the older the data is, the fewer values are available.
+The data is kept in several "archives", each holding the data for its set timeframe at a defined level of granularity.
+The starting point for all archives is the actual saved data sample (Item value).
+So while you might store one sample value per minute for the last 8 hours, you might store only one average value per day for the last year.
+
+Note that, because of its data compression, this service cannot provide precise answers to all queries.
+
+NOTE: rrd4j is for storing numerical data only.
+It cannot store complex data types.
+
+## Persistence Process
+
+Round-robin databases (RRDs) have fixed-length, so-called "archives" for storing values.
+Think of an archive as a "drawer" with a fixed number of "storage boxes" in it.
+
+The persistence service reads data "samples" from the openHAB core at regular intervals, and these are then put into the storage boxes.
+Either a) each sample is stored directly in a box of its own, or b) multiple samples are consolidated into a single box (using a consolidation function).
+
+The service starts by storing samples in the leftmost box in the drawer.
+Once the leftmost box is full, the service starts filling the next box to the right; and so on.
+Once the rightmost box in the drawer is full, the leftmost box is emptied, the content of all boxes is moved one box to the left, and new content is added to the rightmost box.
+
+An example is shown below.
+The values indicated in it are examples and may be chosen by the user.
+
+- Samples are taken at intervals of `60` seconds
+- They are consolidated by the `AVERAGE` function, over `10` samples, into boxes, i.e. a box covers `10 X 60` seconds
+- The full archive contains `250` boxes, i.e. the archive/drawer covers `60 X 10 X 250` seconds
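+
+In this example the full archive therefore spans `60 X 10 X 250 = 150,000` seconds, i.e. roughly 41.7 hours of data.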
+
+## Configuration
+
+Two things must be done in order for an Item to get persisted:
+
+1. It must have a persistence strategy defined in the `rrd4j.persist` file.
+2. It must have a `datasource` defined, as described below.
+
+## Datasources
+
+The database comprises at least one datasource.
+The rrd4j service automatically creates one internal _**default**_ datasource for you.
+Other datasources **may** be configured in addition, in the `services/rrd4j.cfg` file.
+
+By default, if `services/rrd4j.cfg` does not exist, or if an Item is not explicitly listed in a `<dsName>.items` property value in it, then the respective Item will be persisted according to the [default datasource settings](#default-datasource).
+
+By contrast, if an Item **is** explicitly listed in a `<dsName>.items` property value, then it will be persisted according to those respective datasource settings.
+
+Each datasource is defined by three property values (`def`, `archives`, `items`).
+Each `archives` property can comprise settings for one or more archives.
+
+The various datasource property values are explained in the table below.
+
+| Property | Description |
+|---------------------|-------------|
+| `<dsName>`.def | Definition of the range of sample values to be taken, and when. The format is `<dsType>,<heartBeat>,<minValue>,<maxValue>,<sampleInterval>` |
+| `<dsName>`.archives | List of archives to be created. Each archive defines which subset of data samples shall be archived, and for how long. Consists of one or more archive entries separated by a ":" character. The format for one archive entry is `<consolidationFunction>,<xff>,<samplesPerBox>,<boxCount>` |
+| `<dsName>`.items | List of Items whose values shall be sampled and stored in the archive. The format is `Item1,Item2` _**Note: the same Item is not allowed to be listed in more than one datasource!**_ |
+
+For example:
+
+```
+ctr24h.def=COUNTER,900,0,U,60
+ctr24h.archives=AVERAGE,0.5,1,480:AVERAGE,0.5,10,144
+ctr24h.items=Item1,Item2
+```
+
+The description of the various datasource property elements is as follows:
+
+### `<dsName>` (Datasource Name)
+
+The name of the datasource.
+It must be an alphanumeric string.
+
+### `<dsType>` (Datasource Type)
+
+Defines the type of data to be stored.
+It must be one of the following string values:
+
+- **COUNTER** represents an ever-incrementing value (historically this was used for packet or traffic counters on network interfaces; a typical home-automation application would be your electricity meter). If you store the values of such a counter in a simple database and chart them, you will most likely see a nearly flat line, because the increments per time are small compared to the absolute value (e.g. if your electricity meter reads 60567 kWh and you add 0.5 kWh per hour, then your chart over the whole day will show 60567 at the start and 60579 at the end - a nearly invisible change). RRD4J helps you out and displays the difference from one stored value to the next (depending on the selected step size). Please note that the persistence extensions will return differences instead of the actual values if you use this type; this especially leads to wrong values if you try to restoreOnStartup!
+- **GAUGE** represents the reading of e.g. a temperature sensor. You will see only small deviations over the day, and your values will be within a small range that is clearly visible within a chart.
+- **ABSOLUTE** is like a counter, but RRD4J assumes that the counter is reset when the value is read. So these are basically the delta values between reads.
+- **DERIVE** is like a counter, but it can also decrease and therefore have a negative delta.
+
+### `<heartBeat>` (Heart Beat)
+
+The heartbeat parameter helps the database to detect missing values:
+if no new value is stored within `<heartBeat>` seconds, the value is considered missing when charting.
+
+It must be a positive integer value.
+
+### `<minValue> / <maxValue>` (Minimum resp. Maximum Value)
+
+These parameters define the range of acceptable sample values for that datasource.
+They must be either:
+
+- A numeric value, or
+- The letter "U" (unlimited)
+
+### `<sampleInterval>` (Sample Interval)
+
+The time interval (in seconds) between reading consecutive samples from the openHAB core.
+
+It must be a positive integer value.
+
+### `<consolidationFunction>` (Consolidation Function)
+
+Determines the type of data compression to be used when more than one sample is to be stored in a single "storage box".
+For example, if you use the `AVERAGE` function and two samples of `20.0` and `21.0` are to be stored, then the value `20.5` is stored in the box.
+
+It must be one of the following strings:
+
+- **AVERAGE** the average of all the samples is stored in the box
+- **MIN** the lowest sample is stored in the box
+- **MAX** the highest sample is stored in the box
+- **LAST** the last sample is stored in the box
+- **FIRST** the first sample is stored in the box
+- **TOTAL** the sum of all samples is stored in the box
+
+All archives of a datasource must use the same `<consolidationFunction>`.
+
+### `<xff>` (X-files Factor)
+
+Defines the maximum allowed proportion of data samples that are stored as NaN ("Not a Number"), relative to the set number of `<samplesPerBox>`.
+If this proportion is exceeded, NaN is persisted instead of the consolidated value.
+Using 0.5 would require at least 50 percent of the data samples to hold a value other than NaN.
+
+It must be a value between 0 and 1.
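+
+For example, with `<samplesPerBox>` set to `10` and `<xff>` set to `0.5`, a consolidated box is persisted as NaN whenever more than `5` of its `10` samples are NaN.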
+
+### `<samplesPerBox>` (Samples Per Box)
+
+The number of consecutive data samples that will be consolidated to create a single entry ("storage box") in the database.
+If `<samplesPerBox>` is greater than 1 then the samples will be consolidated into the "storage box" by means of the `<consolidationFunction>` described above.
+The time span covered by a single "storage box" is therefore (`<sampleInterval>` x `<samplesPerBox>`) seconds.
+
+It must be a positive integer value.
+
+### `<boxCount>` (Box Count)
+
+The number of "storage boxes" in the archive.
+The time span covered by a full archive is therefore (`<sampleInterval>` x `<samplesPerBox>` x `<boxCount>`) seconds.
+
+It must be a positive integer value.
+
+### Multiple Possible Archives
+
+As already said, each datasource can have one or more archives.
+The purpose of having several archives is to allow different granularities of data storage over different timespans.
+
+In the example below:
+
+```
+ctr24h.def=COUNTER,900,0,U,60
+ctr24h.archives=AVERAGE,0.5,1,480:AVERAGE,0.5,10,144
+ctr24h.items=Item1,Item2
+```
+
+The `ctr24h.def` property defines a datasource that uses a COUNTER, a `<heartBeat>` of 900 seconds, a `<minValue>` of 0, a `<maxValue>` of unlimited, and a `<sampleInterval>` of 60 seconds.
+
+The first archive entry in the `ctr24h.archives` parameter has `480` boxes, each containing `1` sample (or, to be exact, the `AVERAGE` of `1` sample).
+So it covers `480 X 60` seconds of data (8 hours) at a granularity of one minute.
+As a general rule the first archive (and maybe the only one) should have `<samplesPerBox> = 1` so that each sample is stored in one box.
+
+The second archive entry has `144` boxes, each containing the `AVERAGE` of `10` samples.
+So it covers `144 X 10 X 60` seconds of data (24 hours) at a granularity of ten minutes.
+
+## Default Datasource
+
+The service always automatically creates an internal default datasource with the properties below.
+
+```
+defaultNumeric.def=GAUGE,60,U,U,60
+defaultNumeric.archives=AVERAGE,0.5,1,480:AVERAGE,0.5,4,360:AVERAGE,0.5,14,644:AVERAGE,0.5,60,720:AVERAGE,0.5,720,730:AVERAGE,0.5,10080,520
+```
+
+The default datasource type is GAUGE, the heartbeat is 60s, minimum and maximum values are unlimited, and the sample interval is 60s.
+
+The default archives are:
+
+| Archive | Boxes | Samples per Box | Period covered |
+|:---------:|:---------:|:--------:|:-------------:|
+| 1 | 480 | 1 | 8 hrs |
+| 2 | 360 | 4 | 24 hrs |
+| 3 | 644 | 14 | 6.26 days |
+| 4 | 720 | 60 | 30 days |
+| 5 | 730 | 720 | 365 days |
+| 6 | 520 | 10080 | 10 years |
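+
+The period covered by each archive is (`<boxCount>` X `<samplesPerBox>` X `60`) seconds; archive 2, for example, covers `360 X 4 X 60 = 86,400` seconds, i.e. 24 hrs.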
+
+There is no `.items` parameter for the default datasource.
+Implicitly this means that any Item with an allocated strategy in the `rrd4j.persist` file will be persisted using the above-mentioned default settings,
+unless the Item is explicitly listed in the `.items` property value of a datasource in the `rrd4j.cfg` file.
+
+---
+
+## Examples
+
+### `rrd4j.cfg` file
+
+```
+ctr24h.def=COUNTER,900,0,U,60
+ctr24h.archives=AVERAGE,0.5,1,480:AVERAGE,0.5,10,144
+ctr24h.items=Item1,Item2
+ctr7d.def=COUNTER,900,0,U,60
+ctr7d.archives=AVERAGE,0.5,1,480:AVERAGE,0.5,10,144:AVERAGE,0.5,60,672
+ctr7d.items=Item3,Item4
+```
+
+### `rrd4j.persist` file
+
+```java
+Strategies {
+ // for rrd charts, we need a cron strategy
+ everyMinute : "0 * * * * ?"
+}
+
+Items {
+ // persist items on every change and every minute
+ * : strategy = everyChange, everyMinute
+}
+```
+
+**IMPORTANT:**
+The strategy `everyMinute` (60 seconds) **must** be used, otherwise no data will be persisted (stored).
+Other strategies can be used too.
+
+---
+
+## Troubleshooting
+
+If you change the Item type of a persisted data point, you may experience charting or other problems.
+To resolve this issue, remove the old `<item_name>.rrd` file from the `${openhab_home}/etc/rrd4j` folder, or from the `/var/lib/openhab/persistence/rrd4j` folder for apt-get installed openHABs.
+
+Restoring Item values after startup takes some time, and rules may already have started to run in parallel.
+Especially in rules that are triggered via "System started", the restore may not yet have completed, resulting in undefined Item values.
+In these cases the use of restored Item values should be delayed by a couple of seconds; this delay has to be determined experimentally, as illustrated in the sketch below.
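+
+As an illustration, a rule could defer its first read of restored Item states (a sketch only; the `30`-second delay and the Item name `MyItem` are placeholders to adapt):
+
+```java
+rule "Use restored values after startup"
+when
+ System started
+then
+ // give rrd4j some time to finish restoring Item states before reading them
+ createTimer(now.plusSeconds(30)) [ |
+ logInfo("startup", "Restored state of MyItem: " + MyItem.state)
+ ]
+end
+```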
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.persistence.rrd4j</artifactId>
+
+ <name>openHAB Add-ons :: Bundles :: Persistence Service :: RRD4j</name>
+
+ <properties>
+ <bnd.importpackage>!com.mongodb.*,!io.netty.*,!com.bea.*,!io.reactivex.*,!org.reactivestreams.*,!de.erichseifert.*,!org.w3c.*,!org.jvnet.*,!com.ctc.*,!com.sun.*,!com.sleepycat.*,!dagger.*,!org.codehaus.*,!org.glassfish.*,!com.ibm.*,!javax.xml.*,!net.sf.*,!nu.xom.*,!org.bson.*,!org.dom4j.*,!org.jdom.*,!org.jdom2.*,!org.kxml2.io.*,!org.xmlpull.*,!sun.*</bnd.importpackage>
+ </properties>
+
+ <dependencies>
+ <!-- https://mvnrepository.com/artifact/org.rrd4j/rrd4j -->
+ <dependency>
+ <groupId>org.rrd4j</groupId>
+ <artifactId>rrd4j</artifactId>
+ <version>3.3.1</version>
+ </dependency>
+ </dependencies>
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.persistence.rrd4j-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+ <repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+ <feature name="openhab-persistence-rrd4j" description="RRD4j Persistence" version="${project.version}">
+ <feature>openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.persistence.rrd4j/${project.version}</bundle>
+ <configfile finalname="${openhab.conf}/services/rrd4j.cfg" override="false">mvn:${project.groupId}/openhab-addons-external3/${project.version}/cfg/rrd4j</configfile>
+ </feature>
+
+</features>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.rrd4j.internal;
+
+import java.text.DateFormat;
+import java.time.ZonedDateTime;
+import java.util.Date;
+
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.types.State;
+
+/**
+ * This is a Java bean used to return historic items from a rrd4j database.
+ *
+ * @author Kai Kreuzer - Initial contribution
+ *
+ */
+public class RRD4jItem implements HistoricItem {
+
+ private final String name;
+ private final State state;
+ private final ZonedDateTime timestamp;
+
+ public RRD4jItem(String name, State state, ZonedDateTime timestamp) {
+ this.name = name;
+ this.state = state;
+ this.timestamp = timestamp;
+ }
+
+ @Override
+ public String getName() {
+ return name;
+ }
+
+ @Override
+ public State getState() {
+ return state;
+ }
+
+ @Override
+ public ZonedDateTime getTimestamp() {
+ return timestamp;
+ }
+
+ @Override
+ public String toString() {
+ // DateFormat cannot format a ZonedDateTime directly, so convert to java.util.Date first
+ return DateFormat.getDateTimeInstance().format(Date.from(timestamp.toInstant())) + ": " + name + " -> " + state.toString();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.rrd4j.internal;
+
+import java.io.File;
+import java.io.IOException;
+import java.time.Instant;
+import java.time.ZoneId;
+import java.time.ZonedDateTime;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.Iterator;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.concurrent.Executors;
+import java.util.concurrent.RejectedExecutionException;
+import java.util.concurrent.ScheduledExecutorService;
+import java.util.concurrent.ScheduledFuture;
+import java.util.concurrent.TimeUnit;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.OpenHAB;
+import org.openhab.core.common.NamedThreadFactory;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.library.items.ContactItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.library.items.RollershutterItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.DecimalType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.OpenClosedType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.FilterCriteria.Ordering;
+import org.openhab.core.persistence.HistoricItem;
+import org.openhab.core.persistence.PersistenceItemInfo;
+import org.openhab.core.persistence.PersistenceService;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.persistence.strategy.PersistenceCronStrategy;
+import org.openhab.core.persistence.strategy.PersistenceStrategy;
+import org.openhab.core.types.State;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.Reference;
+import org.rrd4j.ConsolFun;
+import org.rrd4j.DsType;
+import org.rrd4j.core.FetchData;
+import org.rrd4j.core.FetchRequest;
+import org.rrd4j.core.RrdDb;
+import org.rrd4j.core.RrdDef;
+import org.rrd4j.core.Sample;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * This is the implementation of the RRD4j {@link PersistenceService}. To learn
+ * more about RRD4j please visit their
+ * <a href="https://github.com/rrd4j/rrd4j">website</a>.
+ *
+ * @author Kai Kreuzer - Initial contribution
+ * @author Jan N. Klug - some improvements
+ * @author Karel Goderis - remove TimerThread dependency
+ */
+@NonNullByDefault
+@Component(service = { PersistenceService.class,
+ QueryablePersistenceService.class }, configurationPid = "org.openhab.rrd4j")
+public class RRD4jPersistenceService implements QueryablePersistenceService {
+
+ private static final String DEFAULT_OTHER = "default_other";
+ private static final String DEFAULT_NUMERIC = "default_numeric";
+ private static final String DEFAULT_QUANTIFIABLE = "default_quantifiable";
+
+ private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(3,
+ new NamedThreadFactory("RRD4j"));
+
+ private final Map<String, @Nullable RrdDefConfig> rrdDefs = new ConcurrentHashMap<>();
+
+ private static final String DATASOURCE_STATE = "state";
+
+ public static final String DB_FOLDER = getUserPersistenceDataFolder() + File.separator + "rrd4j";
+
+ private final Logger logger = LoggerFactory.getLogger(RRD4jPersistenceService.class);
+
+ private final Map<String, @Nullable ScheduledFuture<?>> scheduledJobs = new HashMap<>();
+
+ protected final ItemRegistry itemRegistry;
+
+ @Activate
+ public RRD4jPersistenceService(final @Reference ItemRegistry itemRegistry) {
+ this.itemRegistry = itemRegistry;
+ }
+
+ @Override
+ public String getId() {
+ return "rrd4j";
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return "RRD4j";
+ }
+
+ @Override
+ public synchronized void store(final Item item, @Nullable final String alias) {
+ final String name = alias == null ? item.getName() : alias;
+ RrdDb db = getDB(name);
+ if (db != null) {
+ ConsolFun function = getConsolidationFunction(db);
+ long now = System.currentTimeMillis() / 1000;
+ if (function != ConsolFun.AVERAGE) {
+ try {
+ // we store the last value again, so that the value change
+ // in the database is not interpolated, but
+ // happens right at this spot
+ if (now - 1 > db.getLastUpdateTime()) {
+ // only do it if there is not already a value
+ double lastValue = db.getLastDatasourceValue(DATASOURCE_STATE);
+ if (!Double.isNaN(lastValue)) {
+ Sample sample = db.createSample();
+ sample.setTime(now - 1);
+ sample.setValue(DATASOURCE_STATE, lastValue);
+ sample.update();
+ logger.debug("Stored '{}' with state '{}' in rrd4j database (again)", name,
+ mapToState(lastValue, item.getName()));
+ }
+ }
+ } catch (IOException e) {
+ logger.debug("Error storing last value (again): {}", e.getMessage());
+ }
+ }
+ try {
+ Sample sample = db.createSample();
+ sample.setTime(now);
+
+ DecimalType state = item.getStateAs(DecimalType.class);
+ if (state != null) {
+ double value = state.toBigDecimal().doubleValue();
+ if (db.getDatasource(DATASOURCE_STATE).getType() == DsType.COUNTER) { // counter values must be
+ // adjusted by stepsize
+ value = value * db.getRrdDef().getStep();
+ }
+ sample.setValue(DATASOURCE_STATE, value);
+ sample.update();
+ logger.debug("Stored '{}' with state '{}' in rrd4j database", name, state);
+ }
+ } catch (IllegalArgumentException e) {
+ String message = e.getMessage();
+ if (message != null && message.contains("at least one second step is required")) {
+ // we try to store the value one second later
+ ScheduledFuture<?> job = scheduledJobs.get(name);
+ if (job != null) {
+ job.cancel(true);
+ scheduledJobs.remove(name);
+ }
+ job = scheduler.schedule(() -> store(item, name), 1, TimeUnit.SECONDS);
+ scheduledJobs.put(name, job);
+ } else {
+ logger.warn("Could not persist '{}' to rrd4j database: {}", name, e.getMessage());
+ }
+ } catch (Exception e) {
+ logger.warn("Could not persist '{}' to rrd4j database: {}", name, e.getMessage());
+ }
+ try {
+ db.close();
+ } catch (IOException e) {
+ logger.debug("Error closing rrd4j database: {}", e.getMessage());
+ }
+ }
+ }
+
+ @Override
+ public void store(Item item) {
+ store(item, null);
+ }
+
+ @Override
+ public Iterable<HistoricItem> query(FilterCriteria filter) {
+ String itemName = filter.getItemName();
+ RrdDb db = getDB(itemName);
+ if (db != null) {
+ ConsolFun consolidationFunction = getConsolidationFunction(db);
+ long start = 0L;
+ long end = filter.getEndDate() == null ? System.currentTimeMillis() / 1000
+ : filter.getEndDate().toInstant().getEpochSecond();
+
+ try {
+ if (filter.getBeginDate() == null) {
+ // as rrd goes back for years and gets more and more
+ // inaccurate, we only support descending order
+ // and a single return value
+ // if there is no begin date is given - this case is
+ // required specifically for the historicState()
+ // query, which we want to support
+ if (filter.getOrdering() == Ordering.DESCENDING && filter.getPageSize() == 1
+ && filter.getPageNumber() == 0) {
+ if (filter.getEndDate() == null) {
+ // we are asked only for the most recent value!
+ double lastValue = db.getLastDatasourceValue(DATASOURCE_STATE);
+ if (!Double.isNaN(lastValue)) {
+ HistoricItem rrd4jItem = new RRD4jItem(itemName, mapToState(lastValue, itemName),
+ ZonedDateTime.ofInstant(
+ Instant.ofEpochMilli(db.getLastArchiveUpdateTime() * 1000),
+ ZoneId.systemDefault()));
+ return Collections.singletonList(rrd4jItem);
+ } else {
+ return Collections.emptyList();
+ }
+ } else {
+ start = end;
+ }
+ } else {
+ throw new UnsupportedOperationException("rrd4j does not allow queries without a begin date, "
+ + "unless order is descending and a single value is requested");
+ }
+ } else {
+ start = filter.getBeginDate().toInstant().getEpochSecond();
+ }
+ FetchRequest request = db.createFetchRequest(consolidationFunction, start, end, 1);
+
+ List<HistoricItem> items = new ArrayList<>();
+ FetchData result = request.fetchData();
+ long ts = result.getFirstTimestamp();
+ long step = result.getRowCount() > 1 ? result.getStep() : 0;
+ for (double value : result.getValues(DATASOURCE_STATE)) {
+ if (!Double.isNaN(value) && (((ts >= start) && (ts <= end)) || (start == end))) {
+ RRD4jItem rrd4jItem = new RRD4jItem(itemName, mapToState(value, itemName),
+ ZonedDateTime.ofInstant(Instant.ofEpochMilli(ts * 1000), ZoneId.systemDefault()));
+ items.add(rrd4jItem);
+ }
+ ts += step;
+ }
+ return items;
+ } catch (IOException e) {
+ logger.warn("Could not query rrd4j database for item '{}': {}", itemName, e.getMessage());
+ }
+ }
+ return Collections.emptyList();
+ }
+
+ @Override
+ public Set<PersistenceItemInfo> getItemInfo() {
+ return Collections.emptySet();
+ }
+
+ protected @Nullable synchronized RrdDb getDB(String alias) {
+ RrdDb db = null;
+ File file = new File(DB_FOLDER + File.separator + alias + ".rrd");
+ try {
+ if (file.exists()) {
+ // recreate the RrdDb instance from the file
+ db = new RrdDb(file.getAbsolutePath());
+ } else {
+ File folder = new File(DB_FOLDER);
+ if (!folder.exists()) {
+ folder.mkdirs();
+ }
+ // create a new database file
+ db = new RrdDb(getRrdDef(alias, file));
+ }
+ } catch (IOException e) {
+ logger.error("Could not create rrd4j database file '{}': {}", file.getAbsolutePath(), e.getMessage());
+ } catch (RejectedExecutionException e) {
+ // this happens if the system is shut down
+ logger.debug("Could not create rrd4j database file '{}': {}", file.getAbsolutePath(), e.getMessage());
+ }
+ return db;
+ }
+
+ private @Nullable RrdDefConfig getRrdDefConfig(String itemName) {
+ RrdDefConfig useRdc = null;
+ for (Map.Entry<String, @Nullable RrdDefConfig> e : rrdDefs.entrySet()) {
+ // try to find special config
+ RrdDefConfig rdc = e.getValue();
+ if (rdc != null && rdc.appliesTo(itemName)) {
+ useRdc = rdc;
+ break;
+ }
+ }
+ if (useRdc == null) { // not defined, use defaults
+ try {
+ Item item = itemRegistry.getItem(itemName);
+ if (item instanceof NumberItem) {
+ NumberItem numberItem = (NumberItem) item;
+ return numberItem.getDimension() != null ? rrdDefs.get(DEFAULT_QUANTIFIABLE)
+ : rrdDefs.get(DEFAULT_NUMERIC);
+ }
+ } catch (ItemNotFoundException e) {
+ logger.debug("Could not find item '{}' in registry", itemName);
+ }
+ }
+ return rrdDefs.get(DEFAULT_OTHER);
+ }
+
+ private RrdDef getRrdDef(String itemName, File file) {
+ RrdDef rrdDef = new RrdDef(file.getAbsolutePath());
+ RrdDefConfig useRdc = getRrdDefConfig(itemName);
+ if (useRdc != null) {
+ rrdDef.setStep(useRdc.step);
+ rrdDef.setStartTime(System.currentTimeMillis() / 1000 - 1);
+ rrdDef.addDatasource(DATASOURCE_STATE, useRdc.dsType, useRdc.heartbeat, useRdc.min, useRdc.max);
+ for (RrdArchiveDef rad : useRdc.archives) {
+ rrdDef.addArchive(rad.fcn, rad.xff, rad.steps, rad.rows);
+ }
+ }
+ return rrdDef;
+ }
+
+ public ConsolFun getConsolidationFunction(RrdDb db) {
+ try {
+ return db.getRrdDef().getArcDefs()[0].getConsolFun();
+ } catch (IOException e) {
+ return ConsolFun.MAX;
+ }
+ }
+
+ private State mapToState(double value, String itemName) {
+ try {
+ Item item = itemRegistry.getItem(itemName);
+ if (item instanceof SwitchItem && !(item instanceof DimmerItem)) {
+ return value == 0.0d ? OnOffType.OFF : OnOffType.ON;
+ } else if (item instanceof ContactItem) {
+ return value == 0.0d ? OpenClosedType.CLOSED : OpenClosedType.OPEN;
+ } else if (item instanceof DimmerItem || item instanceof RollershutterItem) {
+ // make sure Items that need PercentTypes instead of DecimalTypes do receive the right information
+ return new PercentType((int) Math.round(value * 100));
+ }
+ } catch (ItemNotFoundException e) {
+ logger.debug("Could not find item '{}' in registry", itemName);
+ }
+ // just return a DecimalType as a fallback
+ return new DecimalType(value);
+ }
+
+ private static String getUserPersistenceDataFolder() {
+ return OpenHAB.getUserDataFolder() + File.separator + "persistence";
+ }
+
+ /**
+ * Reads the service configuration and sets up the default and any user-defined datasource definitions.
+ */
+ public void activate(final Map<String, Object> config) {
+ // add default configurations
+
+ RrdDefConfig defaultNumeric = new RrdDefConfig(DEFAULT_NUMERIC);
+ // use 10 seconds as a step size for numeric values and allow a 10 minute silence between updates
+ defaultNumeric.setDef("GAUGE,600,U,U,10");
+ // define 4 different boxes:
+ // 1. granularity of 10s for the last hour
+ // 2. granularity of 1m for the last week
+ // 3. granularity of 15m for the last year
+ // 4. granularity of 1d for the last 10 years
+ defaultNumeric.addArchives("LAST,0.5,1,360:LAST,0.5,6,10080:LAST,0.5,90,36500:LAST,0.5,8640,3650");
+ rrdDefs.put(DEFAULT_NUMERIC, defaultNumeric);
+
+ RrdDefConfig defaultQuantifiable = new RrdDefConfig(DEFAULT_QUANTIFIABLE);
+ // use 10 seconds as a step size for numeric values and allow a 10 minute silence between updates
+ defaultQuantifiable.setDef("GAUGE,600,U,U,10");
+ // define 4 different boxes:
+ // 1. granularity of 10s for the last hour
+ // 2. granularity of 1m for the last week
+ // 3. granularity of 15m for the last year
+ // 4. granularity of 1d for the last 10 years
+ defaultQuantifiable
+ .addArchives("AVERAGE,0.5,1,360:AVERAGE,0.5,6,10080:AVERAGE,0.5,90,36500:AVERAGE,0.5,8640,3650");
+ rrdDefs.put(DEFAULT_QUANTIFIABLE, defaultQuantifiable);
+
+ RrdDefConfig defaultOther = new RrdDefConfig(DEFAULT_OTHER);
+ // use 5 seconds as a step size for discrete values and allow a 1h silence between updates
+ defaultOther.setDef("GAUGE,3600,U,U,5");
+ // define 4 different boxes:
+ // 1. granularity of 5s for the last two hours
+ // 2. granularity of 1m for the last week
+ // 3. granularity of 15m for the last year
+ // 4. granularity of 4h for the last 10 years
+ defaultOther.addArchives("LAST,0.5,1,1440:LAST,0.5,12,10080:LAST,0.5,180,35040:LAST,0.5,2880,21900");
+ rrdDefs.put(DEFAULT_OTHER, defaultOther);
+
+ if (config.isEmpty()) {
+ logger.debug("using default configuration only");
+ return;
+ }
+
+ Iterator<String> keys = config.keySet().iterator();
+ while (keys.hasNext()) {
+ String key = keys.next();
+
+ if (key.equals("service.pid") || key.equals("component.name")) {
+ // ignore service.pid and name
+ continue;
+ }
+
+ String[] subkeys = key.split("\\.");
+ if (subkeys.length != 2) {
+ logger.debug("config '{}' should have the format 'name.configkey'", key);
+ continue;
+ }
+
+ Object v = config.get(key);
+ if (v instanceof String) {
+ String value = (String) v;
+ String name = subkeys[0].toLowerCase();
+ String property = subkeys[1].toLowerCase();
+
+ if (value.isBlank()) {
+ logger.trace("Config is empty: {}", property);
+ continue;
+ } else {
+ logger.trace("Processing config: {} = {}", property, value);
+ }
+
+ RrdDefConfig rrdDef = rrdDefs.get(name);
+ if (rrdDef == null) {
+ rrdDef = new RrdDefConfig(name);
+ rrdDefs.put(name, rrdDef);
+ }
+
+ try {
+ if (property.equals("def")) {
+ rrdDef.setDef(value);
+ } else if (property.equals("archives")) {
+ rrdDef.addArchives(value);
+ } else if (property.equals("items")) {
+ rrdDef.addItems(value);
+ } else {
+ logger.debug("Unknown property {} : {}", property, value);
+ }
+ } catch (IllegalArgumentException e) {
+ logger.warn("Ignoring illegal configuration: {}", e.getMessage());
+ }
+ }
+ }
+
+ for (RrdDefConfig rrdDef : rrdDefs.values()) {
+ if (rrdDef != null) {
+ if (rrdDef.isValid()) {
+ logger.debug("Created {}", rrdDef);
+ } else {
+ logger.info("Removing invalid definition {}", rrdDef);
+ rrdDefs.remove(rrdDef.name);
+ }
+ }
+ }
+ }
+
+ private class RrdArchiveDef {
+ public @Nullable ConsolFun fcn;
+ public double xff;
+ public int steps, rows;
+
+ @Override
+ public String toString() {
+ StringBuilder sb = new StringBuilder(" " + fcn);
+ sb.append(" xff = ").append(xff);
+ sb.append(" steps = ").append(steps);
+ sb.append(" rows = ").append(rows);
+ return sb.toString();
+ }
+ }
+
+ private class RrdDefConfig {
+ public String name;
+ public @Nullable DsType dsType;
+ public int heartbeat, step;
+ public double min, max;
+ public List<RrdArchiveDef> archives;
+ public List<String> itemNames;
+
+ private boolean isInitialized;
+
+ public RrdDefConfig(String name) {
+ this.name = name;
+ archives = new ArrayList<>();
+ itemNames = new ArrayList<>();
+ isInitialized = false;
+ }
+
+ public void setDef(String defString) {
+ String[] opts = defString.split(",");
+ if (opts.length != 5) { // check if correct number of parameters
+ logger.warn("invalid number of parameters {}: {}", name, defString);
+ return;
+ }
+
+ if (opts[0].equals("ABSOLUTE")) { // dsType
+ dsType = DsType.ABSOLUTE;
+ } else if (opts[0].equals("COUNTER")) {
+ dsType = DsType.COUNTER;
+ } else if (opts[0].equals("DERIVE")) {
+ dsType = DsType.DERIVE;
+ } else if (opts[0].equals("GAUGE")) {
+ dsType = DsType.GAUGE;
+ } else {
+ logger.warn("{}: dsType {} not supported", name, opts[0]);
+ }
+
+ heartbeat = Integer.parseInt(opts[1]);
+
+ if (opts[2].equals("U")) {
+ min = Double.NaN;
+ } else {
+ min = Double.parseDouble(opts[2]);
+ }
+
+ if (opts[3].equals("U")) {
+ max = Double.NaN;
+ } else {
+ max = Double.parseDouble(opts[3]);
+ }
+
+ step = Integer.parseInt(opts[4]);
+
+ isInitialized = true; // successfully initialized
+
+ return;
+ }
+
+ public void addArchives(String archivesString) {
+ String[] splitArchives = archivesString.split(":");
+ for (String archiveString : splitArchives) {
+ String[] opts = archiveString.split(",");
+ if (opts.length != 4) { // check if correct number of parameters
+ logger.warn("invalid number of parameters {}: {}", name, archiveString);
+ return;
+ }
+ RrdArchiveDef arc = new RrdArchiveDef();
+
+ if (opts[0].equals("AVERAGE")) {
+ arc.fcn = ConsolFun.AVERAGE;
+ } else if (opts[0].equals("MIN")) {
+ arc.fcn = ConsolFun.MIN;
+ } else if (opts[0].equals("MAX")) {
+ arc.fcn = ConsolFun.MAX;
+ } else if (opts[0].equals("LAST")) {
+ arc.fcn = ConsolFun.LAST;
+ } else if (opts[0].equals("FIRST")) {
+ arc.fcn = ConsolFun.FIRST;
+ } else if (opts[0].equals("TOTAL")) {
+ arc.fcn = ConsolFun.TOTAL;
+ } else {
+ logger.warn("{}: consolidation function {} not supported", name, opts[0]);
+ }
+ arc.xff = Double.parseDouble(opts[1]);
+ arc.steps = Integer.parseInt(opts[2]);
+ arc.rows = Integer.parseInt(opts[3]);
+ archives.add(arc);
+ }
+ }
+
+ public void addItems(String itemsString) {
+ String[] splitItems = itemsString.split(",");
+ for (String item : splitItems) {
+ itemNames.add(item);
+ }
+ }
+
+ public boolean appliesTo(String item) {
+ return itemNames.contains(item);
+ }
+
+ public boolean isValid() { // a valid configuration must be initialized
+ // and contain at least one function
+ return isInitialized && !archives.isEmpty();
+ }
+
+ @Override
+ public String toString() {
+ StringBuilder sb = new StringBuilder(name);
+ sb.append(" = ").append(dsType);
+ sb.append(" heartbeat = ").append(heartbeat);
+ sb.append(" min/max = ").append(min).append("/").append(max);
+ sb.append(" step = ").append(step);
+ sb.append(" ").append(archives.size()).append(" archives(s) = [");
+ for (RrdArchiveDef arc : archives) {
+ sb.append(arc.toString());
+ }
+ sb.append("] ");
+ sb.append(itemNames.size()).append(" item(s) = [");
+ for (String item : itemNames) {
+ sb.append(item).append(" ");
+ }
+ sb.append("]");
+ return sb.toString();
+ }
+ }
+
+ @Override
+ public List<PersistenceStrategy> getDefaultStrategies() {
+ return List.of(PersistenceStrategy.Globals.RESTORE, PersistenceStrategy.Globals.CHANGE,
+ new PersistenceCronStrategy("everyMinute", "0 * * * * ?"));
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.rrd4j.internal.charts;
+
+import java.awt.Color;
+import java.awt.Font;
+import java.awt.image.BufferedImage;
+import java.io.File;
+import java.io.IOException;
+import java.util.Date;
+import java.util.HashMap;
+import java.util.Hashtable;
+import java.util.Map;
+
+import javax.imageio.ImageIO;
+import javax.servlet.Servlet;
+import javax.servlet.ServletConfig;
+import javax.servlet.ServletException;
+import javax.servlet.ServletRequest;
+import javax.servlet.ServletResponse;
+
+import org.openhab.core.items.GroupItem;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.library.items.NumberItem;
+import org.openhab.core.ui.chart.ChartProvider;
+import org.openhab.core.ui.items.ItemUIRegistry;
+import org.openhab.persistence.rrd4j.internal.RRD4jPersistenceService;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.Deactivate;
+import org.osgi.service.component.annotations.Reference;
+import org.osgi.service.http.HttpService;
+import org.osgi.service.http.NamespaceException;
+import org.rrd4j.ConsolFun;
+import org.rrd4j.core.RrdDb;
+import org.rrd4j.graph.RrdGraph;
+import org.rrd4j.graph.RrdGraphDef;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * This servlet generates time-series charts for a given set of items.
+ * It accepts the following HTTP parameters:
+ * <ul>
+ * <li>w: width in pixels of image to generate</li>
+ * <li>h: height in pixels of image to generate</li>
+ * <li>period: the time span for the x-axis. Value can be h,4h,8h,12h,D,3D,W,2W,M,2M,4M,Y</li>
+ * <li>items: A comma separated list of item names to display</li>
+ * <li>groups: A comma separated list of group names, whose members should be displayed</li>
+ * </ul>
+ *
+ * @author Kai Kreuzer - Initial contribution
+ * @author Chris Jackson - a few improvements
+ * @author Jan N. Klug - a few improvements
+ *
+ */
+@Component(service = ChartProvider.class)
+public class RRD4jChartServlet implements Servlet, ChartProvider {
+
+ private final Logger logger = LoggerFactory.getLogger(RRD4jChartServlet.class);
+
+ /** the URI of this servlet */
+ public static final String SERVLET_NAME = "/rrdchart.png";
+
+ protected static final Color[] LINECOLORS = new Color[] { Color.RED, Color.GREEN, Color.BLUE, Color.MAGENTA,
+ Color.ORANGE, Color.CYAN, Color.PINK, Color.DARK_GRAY, Color.YELLOW };
+ protected static final Color[] AREACOLORS = new Color[] { new Color(255, 0, 0, 30), new Color(0, 255, 0, 30),
+ new Color(0, 0, 255, 30), new Color(255, 0, 255, 30), new Color(255, 128, 0, 30),
+ new Color(0, 255, 255, 30), new Color(255, 0, 128, 30), new Color(255, 128, 128, 30),
+ new Color(255, 255, 0, 30) };
+
+ protected static final Map<String, Long> PERIODS = new HashMap<>();
+
+ static {
+ PERIODS.put("h", -3600000L);
+ PERIODS.put("4h", -14400000L);
+ PERIODS.put("8h", -28800000L);
+ PERIODS.put("12h", -43200000L);
+ PERIODS.put("D", -86400000L);
+ PERIODS.put("3D", -259200000L);
+ PERIODS.put("W", -604800000L);
+ PERIODS.put("2W", -1209600000L);
+ PERIODS.put("M", -2592000000L);
+ PERIODS.put("2M", -5184000000L);
+ PERIODS.put("4M", -10368000000L);
+ PERIODS.put("Y", -31536000000L);
+ }
+
+ @Reference
+ protected HttpService httpService;
+
+ @Reference
+ protected ItemUIRegistry itemUIRegistry;
+
+ @Activate
+ protected void activate() {
+ try {
+ logger.debug("Starting up rrd chart servlet at {}", SERVLET_NAME);
+ httpService.registerServlet(SERVLET_NAME, this, new Hashtable<>(), httpService.createDefaultHttpContext());
+ } catch (NamespaceException | ServletException e) {
+ logger.error("Error during servlet startup", e);
+ }
+ }
+
+ @Deactivate
+ protected void deactivate() {
+ httpService.unregister(SERVLET_NAME);
+ }
+
+ @Override
+ public void service(ServletRequest req, ServletResponse res) throws ServletException, IOException {
+ logger.debug("RRD4J received incoming chart request: {}", req);
+
+ int width = 480;
+ try {
+ width = Integer.parseInt(req.getParameter("w"));
+ } catch (NumberFormatException e) {
+ // keep the default width if the parameter is missing or malformed
+ }
+ int height = 240;
+ try {
+ height = Integer.parseInt(req.getParameter("h"));
+ } catch (NumberFormatException e) {
+ // keep the default height if the parameter is missing or malformed
+ }
+ Long period = PERIODS.get(req.getParameter("period"));
+ if (period == null) {
+ // use a day as the default period
+ period = PERIODS.get("D");
+ }
+ // Create the start and stop time
+ Date timeEnd = new Date();
+ Date timeBegin = new Date(timeEnd.getTime() + period);
+
+ // Set the content type to that provided by the chart provider
+ res.setContentType("image/" + getChartType());
+ try {
+ BufferedImage chart = createChart(null, null, timeBegin, timeEnd, height, width, req.getParameter("items"),
+ req.getParameter("groups"), null, null);
+ ImageIO.write(chart, getChartType().toString(), res.getOutputStream());
+ } catch (ItemNotFoundException e) {
+ logger.debug("Item not found error while generating chart.");
+ } catch (IllegalArgumentException e) {
+ logger.debug("Illegal argument in chart", e);
+ }
+ }
+
+ /**
+ * Adds a line for the item to the graph definition.
+ * The color of the line is determined by the counter: it simply picks the corresponding index from LINECOLORS
+ * (rolling over if necessary).
+ *
+ * @param graphDef the graph definition to fill
+ * @param item the item to add a line for
+ * @param counter defines the number of the datasource and is used to determine the line color
+ */
+ protected void addLine(RrdGraphDef graphDef, Item item, int counter) {
+ Color color = LINECOLORS[counter % LINECOLORS.length];
+ String label = itemUIRegistry.getLabel(item.getName());
+ String rrdName = RRD4jPersistenceService.DB_FOLDER + File.separator + item.getName() + ".rrd";
+ ConsolFun consolFun;
+ if (label != null && label.contains("[") && label.contains("]")) {
+ label = label.substring(0, label.indexOf('['));
+ }
+ try {
+ RrdDb db = new RrdDb(rrdName);
+ consolFun = db.getRrdDef().getArcDefs()[0].getConsolFun();
+ db.close();
+ } catch (IOException e) {
+ consolFun = ConsolFun.MAX;
+ }
+ if (item instanceof NumberItem) {
+ // we only draw a line
+ graphDef.datasource(Integer.toString(counter), rrdName, "state", consolFun); // RRD4jService.getConsolidationFunction(item));
+ graphDef.line(Integer.toString(counter), color, label, 2);
+ } else {
+ // we draw a line and fill the area beneath it with a transparent color
+ graphDef.datasource(Integer.toString(counter), rrdName, "state", consolFun); // RRD4jService.getConsolidationFunction(item));
+ Color areaColor = AREACOLORS[counter % AREACOLORS.length];
+
+ graphDef.area(Integer.toString(counter), areaColor);
+ graphDef.line(Integer.toString(counter), color, label, 2);
+ }
+ }
+
+ @Override
+ public void init(ServletConfig config) throws ServletException {
+ }
+
+ @Override
+ public ServletConfig getServletConfig() {
+ return null;
+ }
+
+ @Override
+ public String getServletInfo() {
+ return null;
+ }
+
+ @Override
+ public void destroy() {
+ }
+
+ // ----------------------------------------------------------
+ // The following methods implement the ChartProvider interface
+
+ @Override
+ public String getName() {
+ return "rrd4j";
+ }
+
+ @Override
+ public BufferedImage createChart(String service, String theme, Date startTime, Date endTime, int height, int width,
+ String items, String groups, Integer dpi, Boolean legend) throws ItemNotFoundException {
+ RrdGraphDef graphDef = new RrdGraphDef();
+
+ long period = (startTime.getTime() - endTime.getTime()) / 1000;
+
+ graphDef.setWidth(width);
+ graphDef.setHeight(height);
+ graphDef.setAntiAliasing(true);
+ graphDef.setImageFormat("PNG");
+ graphDef.setStartTime(period);
+ graphDef.setTextAntiAliasing(true);
+ graphDef.setLargeFont(new Font("SansSerif", Font.PLAIN, 15));
+ graphDef.setSmallFont(new Font("SansSerif", Font.PLAIN, 11));
+
+ int seriesCounter = 0;
+
+ // Loop through all the items
+ if (items != null) {
+ String[] itemNames = items.split(",");
+ for (String itemName : itemNames) {
+ Item item = itemUIRegistry.getItem(itemName);
+ addLine(graphDef, item, seriesCounter++);
+ }
+ }
+
+ // Loop through all the groups and add each item from each group
+ if (groups != null) {
+ String[] groupNames = groups.split(",");
+ for (String groupName : groupNames) {
+ Item item = itemUIRegistry.getItem(groupName);
+ if (item instanceof GroupItem) {
+ GroupItem groupItem = (GroupItem) item;
+ for (Item member : groupItem.getMembers()) {
+ addLine(graphDef, member, seriesCounter++);
+ }
+ } else {
+ throw new ItemNotFoundException("Item '" + item.getName() + "' defined in groups is not a group.");
+ }
+ }
+ }
+
+ // Render the graph to an image
+ RrdGraph graph;
+ try {
+ graph = new RrdGraph(graphDef);
+ BufferedImage bi = new BufferedImage(graph.getRrdGraphInfo().getWidth(),
+ graph.getRrdGraphInfo().getHeight(), BufferedImage.TYPE_INT_RGB);
+ graph.render(bi.getGraphics());
+
+ return bi;
+ } catch (IOException e) {
+ logger.error("Error generating graph.", e);
+ }
+
+ return null;
+ }
+
+ @Override
+ public ImageType getChartType() {
+ return ImageType.png;
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ <attribute name="test" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" path="target/generated-sources/annotations">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="target/generated-test-sources/test-annotations">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="test" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.voice.googletts</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+# Google Cloud Text-to-Speech
+
+The Google Cloud TTS service uses the non-free Google Cloud Text-to-Speech API to convert text or Speech Synthesis Markup Language (SSML) input into audio data of natural human speech.
+It provides multiple voices, available in different languages and variants, and applies DeepMind’s research in WaveNet and Google’s neural networks.
+The implementation caches the converted texts to reduce the load on the API and to speed up repeated conversions.
+The cached files are stored in the `$OPENHAB_USERDATA/cache/org.openhab.voice.googletts` folder.
+Be aware that using this service may incur costs on your Google Cloud account.
+You can find pricing information on the [documentation page](https://cloud.google.com/text-to-speech/#pricing-summary).
+
+## Table of Contents
+
+<!-- MarkdownTOC -->
+
+* [Obtaining Credentials](#obtaining-credentials)
+* [Service Configuration](#service-configuration)
+* [Voice Configuration](#voice-configuration)
+
+<!-- /MarkdownTOC -->
+
+## Obtaining Credentials
+
+Before you can integrate this service with Google Cloud Text-to-Speech, you must have a Google API Console project:
+
+* Select or create a GCP project. [link](https://console.cloud.google.com/cloud-resource-manager)
+* Make sure that billing is enabled for your project. [link](https://cloud.google.com/billing/docs/how-to/modify-project)
+* Enable the Cloud Text-to-Speech API. [link](https://console.cloud.google.com/apis/dashboard)
+* Set up authentication:
+ * Go to the "APIs & Services" -> "Credentials" page in the GCP Console and select your project. [link](https://console.cloud.google.com/apis/credentials)
+ * From the "Create credentials" drop-down list, select "OAuth client ID".
+ * Select application type "Other" and enter a name into the "Name" field.
+ * Click "Create". A pop-up appears, showing your "client ID" and "client secret".
+
+## Service Configuration
+
+Using your favorite configuration UI (e.g. Paper UI), edit **Services / Voice / Google Cloud Text-to-Speech** settings and set:
+
+* **Client Id** - The Google Cloud Platform OAuth 2.0 client id.
+* **Client Secret** - The Google Cloud Platform OAuth 2.0 client secret.
+* **Authorization Code** - A one-time code used to retrieve the access and refresh tokens from Google Cloud Platform.
+**Please go to your browser ...**
+[https://accounts.google.com/o/oauth2/auth?client_id={{clientId}}&redirect_uri=urn:ietf:wg:oauth:2.0:oob&scope=https://www.googleapis.com/auth/cloud-platform&response_type=code](https://accounts.google.com/o/oauth2/auth?client_id={{clientId}}&redirect_uri=urn:ietf:wg:oauth:2.0:oob&scope=https://www.googleapis.com/auth/cloud-platform&response_type=code) (replace `{{clientId}}` by your Client Id)
+**... to generate an auth-code and paste it here**.
+After initial authorization, this code is not needed anymore.
+It is recommended to clear this configuration parameter afterwards.
+* **Pitch** - The pitch of the selected voice, up to 20 semitones up or down.
+* **Volume Gain** - The volume gain of the output, between -96dB and 16dB.
+* **Speaking Rate** - The speaking rate, from 0.25 (four times slower) to 4.0 (four times faster than the normal rate).
+* **Purge Cache** - Purges the cache, e.g. after testing different voice configuration parameters.
+
+When enabled, the cache is purged once.
+Make sure to disable this setting again so the cache is maintained after restarts.
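+
+The service can also be configured with a plain config file. A minimal sketch, assuming the usual `$OPENHAB_CONF/services` folder (the file name is your choice; the parameter names match the ones listed above):
+
+```
+# services/googletts.cfg
+org.openhab.voice.googletts:clientId=<your-client-id>
+org.openhab.voice.googletts:clientSecret=<your-client-secret>
+org.openhab.voice.googletts:authcode=<one-time-authorization-code>
+org.openhab.voice.googletts:pitch=0
+org.openhab.voice.googletts:speakingRate=1
+org.openhab.voice.googletts:volumeGainDb=0
+org.openhab.voice.googletts:purgeCache=false
+```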
+
+## Voice Configuration
+
+Using your favorite configuration UI:
+
+* Edit **System** settings.
+* Edit **Voice** settings.
+* Set **Google Cloud** as **Default Text-to-Speech**.
+* Choose default voice for the setup.
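+
+The defaults can also be set via file-based configuration. A sketch, assuming the standard `org.openhab.voice` settings pid and an illustrative voice UID (composed of `googletts:` plus the technical voice name):
+
+```
+# services/runtime.cfg
+org.openhab.voice:defaultTTS=googletts
+org.openhab.voice:defaultVoice=googletts:enUSWavenetD
+```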
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.voice.googletts</artifactId>
+
+ <name>openHAB Add-ons :: Bundles :: Voice :: Google Cloud Text-to-Speech</name>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<features name="org.openhab.voice.googletts-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+ <repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
+
+ <feature name="openhab-voice-googletts" description="Google Cloud Text-to-Speech" version="${project.version}">
+ <feature>openhab-runtime-base</feature>
+ <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.voice.googletts/${project.version}</bundle>
+ </feature>
+</features>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+
+/**
+ * Thrown if an authentication error occurs.
+ *
+ * @author Christoph Weitkamp - Initial contribution
+ *
+ */
+@NonNullByDefault
+public class AuthenticationException extends Exception {
+
+ private static final long serialVersionUID = 1L;
+
+ public AuthenticationException() {
+ }
+
+ public AuthenticationException(String message) {
+ super(message);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal;
+
+import java.io.File;
+import java.io.FileNotFoundException;
+import java.io.FileOutputStream;
+import java.io.IOException;
+import java.math.BigInteger;
+import java.nio.charset.StandardCharsets;
+import java.nio.file.Files;
+import java.security.MessageDigest;
+import java.security.NoSuchAlgorithmException;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Base64;
+import java.util.Collections;
+import java.util.Dictionary;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+
+import org.eclipse.jdt.annotation.Nullable;
+import org.eclipse.jetty.http.HttpHeader;
+import org.eclipse.jetty.http.MimeTypes;
+import org.openhab.core.audio.AudioFormat;
+import org.openhab.core.auth.client.oauth2.AccessTokenResponse;
+import org.openhab.core.auth.client.oauth2.OAuthClientService;
+import org.openhab.core.auth.client.oauth2.OAuthException;
+import org.openhab.core.auth.client.oauth2.OAuthFactory;
+import org.openhab.core.auth.client.oauth2.OAuthResponseException;
+import org.openhab.core.io.net.http.HttpRequestBuilder;
+import org.openhab.voice.googletts.internal.protocol.AudioConfig;
+import org.openhab.voice.googletts.internal.protocol.AudioEncoding;
+import org.openhab.voice.googletts.internal.protocol.ListVoicesResponse;
+import org.openhab.voice.googletts.internal.protocol.SsmlVoiceGender;
+import org.openhab.voice.googletts.internal.protocol.SynthesisInput;
+import org.openhab.voice.googletts.internal.protocol.SynthesizeSpeechRequest;
+import org.openhab.voice.googletts.internal.protocol.SynthesizeSpeechResponse;
+import org.openhab.voice.googletts.internal.protocol.Voice;
+import org.openhab.voice.googletts.internal.protocol.VoiceSelectionParams;
+import org.osgi.service.cm.Configuration;
+import org.osgi.service.cm.ConfigurationAdmin;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+
+/**
+ * Google Cloud TTS API call implementation.
+ *
+ * @author Gabor Bicskei - Initial contribution and API
+ */
+class GoogleCloudAPI {
+
+ private static final char EXTENSION_SEPARATOR = '.';
+ private static final char UNIX_SEPARATOR = '/';
+ private static final char WINDOWS_SEPARATOR = '\\';
+
+ private static final String BEARER = "Bearer ";
+
+ private static final String GCP_AUTH_URI = "https://accounts.google.com/o/oauth2/auth";
+ private static final String GCP_TOKEN_URI = "https://accounts.google.com/o/oauth2/token";
+ private static final String GCP_REDIRECT_URI = "urn:ietf:wg:oauth:2.0:oob";
+ /**
+ * Google Cloud Platform authorization scope
+ */
+ private static final String GCP_SCOPE = "https://www.googleapis.com/auth/cloud-platform";
+
+ /**
+ * URL used for retrieving the list of available voices
+ */
+ private static final String LIST_VOICES_URL = "https://texttospeech.googleapis.com/v1/voices";
+
+ /**
+ * URL used for synthesizing text to speech
+ */
+ private static final String SYNTHESIZE_SPEECH_URL = "https://texttospeech.googleapis.com/v1/text:synthesize";
+
+ /**
+ * Logger
+ */
+ private final Logger logger = LoggerFactory.getLogger(GoogleCloudAPI.class);
+
+ /**
+ * Supported voices and locales
+ */
+ private final Map<Locale, Set<GoogleTTSVoice>> voices = new HashMap<>();
+
+ /**
+ * Cache folder
+ */
+ private File cacheFolder;
+
+ /**
+ * Configuration
+ */
+ private @Nullable GoogleTTSConfig config;
+
+ /**
+ * Status flag
+ */
+ private boolean initialized;
+
+ private final Gson gson = new GsonBuilder().create();
+ private final ConfigurationAdmin configAdmin;
+ private final OAuthFactory oAuthFactory;
+
+ private @Nullable OAuthClientService oAuthService;
+
+ /**
+ * Constructor.
+ *
+ * @param cacheFolder Service cache folder
+ */
+ GoogleCloudAPI(ConfigurationAdmin configAdmin, OAuthFactory oAuthFactory, File cacheFolder) {
+ this.configAdmin = configAdmin;
+ this.oAuthFactory = oAuthFactory;
+ this.cacheFolder = cacheFolder;
+ }
+
+ /**
+ * Configuration update.
+ *
+ * @param config New configuration.
+ */
+ void setConfig(GoogleTTSConfig config) {
+ this.config = config;
+
+ String clientId = config.clientId;
+ String clientSecret = config.clientSecret;
+ if (clientId != null && !clientId.isEmpty() && clientSecret != null && !clientSecret.isEmpty()) {
+ try {
+ final OAuthClientService oAuthService = oAuthFactory.createOAuthClientService(
+ GoogleTTSService.SERVICE_PID, GCP_TOKEN_URI, GCP_AUTH_URI, clientId, clientSecret, GCP_SCOPE,
+ false);
+ this.oAuthService = oAuthService;
+ getAccessToken();
+ initialized = true;
+ initVoices();
+ } catch (AuthenticationException | IOException ex) {
+ logger.warn("Error initializing Google Cloud TTS service: {}", ex.getMessage());
+ oAuthService = null;
+ initialized = false;
+ voices.clear();
+ }
+ } else {
+ oAuthService = null;
+ initialized = false;
+ voices.clear();
+ }
+
+ // maintain cache
+ if (config.purgeCache) {
+ File[] files = cacheFolder.listFiles();
+ if (files != null && files.length > 0) {
+ Arrays.stream(files).forEach(File::delete);
+ }
+ logger.debug("Cache purged.");
+ }
+ }
+
+ /**
+ * Fetches the OAuth2 tokens from Google Cloud Platform if the auth-code is set in the configuration. If successful,
+ * the auth-code will be removed from the configuration.
+ */
+ private void getAccessToken() throws AuthenticationException, IOException {
+ String authcode = config.authcode;
+ if (authcode != null && !authcode.isEmpty()) {
+ logger.debug("Trying to get access and refresh tokens.");
+ try {
+ oAuthService.getAccessTokenResponseByAuthorizationCode(authcode, GCP_REDIRECT_URI);
+ } catch (OAuthException | OAuthResponseException ex) {
+ logger.debug("Error fetching access token: {}", ex.getMessage(), ex);
+ throw new AuthenticationException(
+ "Error fetching access token. Invalid authcode? Please generate a new one.");
+ }
+
+ config.authcode = null;
+
+ try {
+ Configuration serviceConfig = configAdmin.getConfiguration(GoogleTTSService.SERVICE_PID);
+ Dictionary<String, Object> configProperties = serviceConfig.getProperties();
+ if (configProperties != null) {
+ configProperties.put(GoogleTTSService.PARAM_AUTHCODE, "");
+ serviceConfig.update(configProperties);
+ }
+ } catch (IOException e) {
+ // should not happen
+ logger.warn(
+ "Failed to update configuration for Google Cloud TTS service. Please clear the 'authcode' configuration parameter manualy.");
+ }
+ }
+ }
+
+ private String getAuthorizationHeader() throws AuthenticationException, IOException {
+ final AccessTokenResponse accessTokenResponse;
+ try {
+ accessTokenResponse = oAuthService.getAccessTokenResponse();
+ } catch (OAuthException | OAuthResponseException ex) {
+ logger.debug("Error fetching access token: {}", ex.getMessage(), ex);
+ throw new AuthenticationException(
+ "Error fetching access token. Invalid authcode? Please generate a new one.");
+ }
+ if (accessTokenResponse == null || accessTokenResponse.getAccessToken() == null
+ || accessTokenResponse.getAccessToken().isEmpty()) {
+ throw new AuthenticationException("No access token. Is this thing authorized?");
+ }
+ return BEARER + accessTokenResponse.getAccessToken();
+ }
+
+ /**
+ * Loads supported audio formats
+ *
+ * @return Set of audio formats
+ */
+ Set<String> getSupportedAudioFormats() {
+ Set<String> formats = new HashSet<>();
+ for (AudioEncoding audioEncoding : AudioEncoding.values()) {
+ if (audioEncoding != AudioEncoding.AUDIO_ENCODING_UNSPECIFIED) {
+ formats.add(audioEncoding.toString());
+ }
+ }
+ return formats;
+ }
+
+ /**
+ * Supported locales.
+ *
+ * @return Set of locales
+ */
+ Set<Locale> getSupportedLocales() {
+ return voices.keySet();
+ }
+
+ /**
+ * Supported voices for locale.
+ *
+ * @param locale Locale
+ * @return Set of voices
+ */
+ Set<GoogleTTSVoice> getVoicesForLocale(Locale locale) {
+ Set<GoogleTTSVoice> localeVoices = voices.get(locale);
+ return localeVoices != null ? localeVoices : Collections.emptySet();
+ }
+
+ /**
+ * Google API call to load locales and voices.
+ */
+ private void initVoices() throws AuthenticationException, IOException {
+ if (oAuthService != null) {
+ voices.clear();
+ for (GoogleTTSVoice voice : listVoices()) {
+ voices.computeIfAbsent(voice.getLocale(), k -> new HashSet<>()).add(voice);
+ }
+ } else {
+ logger.error("Google client is not initialized!");
+ }
+ }
+
+ @SuppressWarnings("null")
+ private List<GoogleTTSVoice> listVoices() throws AuthenticationException, IOException {
+ HttpRequestBuilder builder = HttpRequestBuilder.getFrom(LIST_VOICES_URL)
+ .withHeader(HttpHeader.AUTHORIZATION.name(), getAuthorizationHeader());
+
+ ListVoicesResponse listVoicesResponse = gson.fromJson(builder.getContentAsString(), ListVoicesResponse.class);
+
+ if (listVoicesResponse == null || listVoicesResponse.getVoices() == null) {
+ return Collections.emptyList();
+ }
+
+ List<GoogleTTSVoice> result = new ArrayList<>();
+ for (Voice voice : listVoicesResponse.getVoices()) {
+ for (String languageCode : voice.getLanguageCodes()) {
+ result.add(new GoogleTTSVoice(Locale.forLanguageTag(languageCode), voice.getName(),
+ voice.getSsmlGender().name()));
+ }
+ }
+
+ return result;
+ }
+
+ /**
+ * Maps an openHAB audio codec to the matching Google audio format and file extension.
+ *
+ * @param codec Requested codec
+ * @return String array of Google audio format and the file extension to use.
+ */
+ private String[] getFormatForCodec(String codec) {
+ switch (codec) {
+ case AudioFormat.CODEC_MP3:
+ return new String[] { AudioEncoding.MP3.toString(), "mp3" };
+ case AudioFormat.CODEC_PCM_SIGNED:
+ return new String[] { AudioEncoding.LINEAR16.toString(), "wav" };
+ default:
+ throw new IllegalArgumentException("Audio format " + codec + " is not yet supported");
+ }
+ }
+
+ byte[] synthesizeSpeech(String text, GoogleTTSVoice voice, String codec) {
+ String[] format = getFormatForCodec(codec);
+ String fileNameInCache = getUniqueFilenameForText(text, voice.getTechnicalName());
+ File audioFileInCache = new File(cacheFolder, fileNameInCache + "." + format[1]);
+ try {
+ // check if in cache
+ if (audioFileInCache.exists()) {
+ logger.debug("Audio file {} was found in cache.", audioFileInCache.getName());
+ return Files.readAllBytes(audioFileInCache.toPath());
+ }
+
+ // if not in cache, get audio data and put to cache
+ byte[] audio = synthesizeSpeechByGoogle(text, voice, format[0]);
+ if (audio != null) {
+ saveAudioAndTextToFile(text, audioFileInCache, audio, voice.getTechnicalName());
+ }
+ return audio;
+ } catch (AuthenticationException ex) {
+ logger.warn("Error initializing Google Cloud TTS service: {}", ex.getMessage());
+ oAuthService = null;
+ initialized = false;
+ voices.clear();
+ return null;
+ } catch (FileNotFoundException ex) {
+ logger.warn("Could not write {} to cache", audioFileInCache, ex);
+ return null;
+ } catch (IOException ex) {
+ logger.error("Could not write {}Â to cache", audioFileInCache, ex);
+ return null;
+ }
+ }
+
+ /**
+ * Create cache entry.
+ *
+ * @param text Converted text.
+ * @param cacheFile Cache entry file.
+ * @param audio Byte array of the audio.
+ * @param voiceName Used voice
+ * @throws IOException in case of file handling exceptions
+ */
+ private void saveAudioAndTextToFile(String text, File cacheFile, byte[] audio, String voiceName)
+ throws IOException {
+ logger.debug("Caching audio file {}", cacheFile.getName());
+ try (FileOutputStream audioFileOutputStream = new FileOutputStream(cacheFile)) {
+ audioFileOutputStream.write(audio);
+ }
+
+ // also write the text to a file for transparency
+ // this makes it possible to tell which content is in which audio file
+ String textFileName = removeExtension(cacheFile.getName()) + ".txt";
+ logger.debug("Caching text file {}", textFileName);
+ try (FileOutputStream textFileOutputStream = new FileOutputStream(new File(cacheFolder, textFileName))) {
+ // @formatter:off
+ StringBuilder sb = new StringBuilder("Config: ")
+ .append(config.toConfigString())
+ .append(",voice=")
+ .append(voiceName)
+ .append(System.lineSeparator())
+ .append("Text: ")
+ .append(text)
+ .append(System.lineSeparator());
+ // @formatter:on
+ textFileOutputStream.write(sb.toString().getBytes(StandardCharsets.UTF_8));
+ }
+ }
+
+ /**
+ * Removes the extension of a file name.
+ *
+ * @param fileName the file name to remove the extension of
+ * @return the filename without the extension
+ */
+ private String removeExtension(String fileName) {
+ int extensionPos = fileName.lastIndexOf(EXTENSION_SEPARATOR);
+ int lastSeparator = Math.max(fileName.lastIndexOf(UNIX_SEPARATOR), fileName.lastIndexOf(WINDOWS_SEPARATOR));
+ return lastSeparator > extensionPos ? fileName : fileName.substring(0, extensionPos);
+ }
+
+ /**
+ * Call Google service to synthesize the required text
+ *
+ * @param text Text to synthesize
+ * @param voice Voice parameter
+ * @param audioFormat Audio encoding format
+ * @return the synthesized audio as a byte array, or {@code null} if the service returned no response
+ */
+ @SuppressWarnings({ "null", "unused" })
+ private byte[] synthesizeSpeechByGoogle(String text, GoogleTTSVoice voice, String audioFormat)
+ throws AuthenticationException, IOException {
+ AudioConfig audioConfig = new AudioConfig(AudioEncoding.valueOf(audioFormat), config.pitch, config.speakingRate,
+ config.volumeGainDb);
+ SynthesisInput synthesisInput = new SynthesisInput(text);
+ VoiceSelectionParams voiceSelectionParams = new VoiceSelectionParams(voice.getLocale().getLanguage(),
+ voice.getLabel(), SsmlVoiceGender.valueOf(voice.getSsmlGender()));
+
+ SynthesizeSpeechRequest request = new SynthesizeSpeechRequest(audioConfig, synthesisInput,
+ voiceSelectionParams);
+
+ HttpRequestBuilder builder = HttpRequestBuilder.postTo(SYNTHESIZE_SPEECH_URL)
+ .withHeader(HttpHeader.AUTHORIZATION.name(), getAuthorizationHeader())
+ .withContent(gson.toJson(request), MimeTypes.Type.APPLICATION_JSON.name());
+
+ SynthesizeSpeechResponse synthesizeSpeechResponse = gson.fromJson(builder.getContentAsString(),
+ SynthesizeSpeechResponse.class);
+
+ if (synthesizeSpeechResponse == null) {
+ return null;
+ }
+
+ return Base64.getDecoder().decode(synthesizeSpeechResponse.getAudioContent());
+ }
+
+ /**
+ * Gets a unique filename for a given text by creating an MD5 hash of it. The
+ * hash is prefixed with the technical name of the voice.
+ * <p>
+ * Sample: "enUSWavenetD_00a2653ac5f77063bc4ea2fee87318d3"
+ */
+ private String getUniqueFilenameForText(String text, String voiceName) {
+ try {
+ MessageDigest md = MessageDigest.getInstance("MD5");
+ byte[] bytesOfMessage = (config.toConfigString() + text).getBytes(StandardCharsets.UTF_8);
+ String fileNameHash = String.format("%032x", new BigInteger(1, md.digest(bytesOfMessage)));
+ return voiceName + "_" + fileNameHash;
+ } catch (NoSuchAlgorithmException ex) {
+ // should not happen
+ logger.error("Could not create MD5 hash for '{}'", text, ex);
+ return null;
+ }
+ }
+
+ boolean isInitialized() {
+ return initialized;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+
+/**
+ * Configuration for the Google Cloud Text-to-Speech service.
+ *
+ * @author Gabor Bicskei - Initial contribution
+ */
+@NonNullByDefault
+class GoogleTTSConfig {
+ /**
+ * Access to Google Cloud Platform
+ */
+ public @Nullable String clientId;
+ public @Nullable String clientSecret;
+ public @Nullable String authcode;
+
+ /**
+ * Pitch
+ */
+ public Double pitch = 0d;
+
+ /**
+ * Volume Gain
+ */
+ public Double volumeGainDb = 0d;
+
+ /**
+ * Speaking Rate
+ */
+ public Double speakingRate = 1d;
+
+ /**
+ * Purge cache after configuration changes.
+ */
+ public Boolean purgeCache = Boolean.FALSE;
+
+ @Override
+ public String toString() {
+ return "GoogleTTSConfig{pitch=" + pitch + ", speakingRate=" + speakingRate + ", volumeGainDb=" + volumeGainDb
+ + ", purgeCache=" + purgeCache + '}';
+ }
+
+ String toConfigString() {
+ return String.format("pitch=%f,speakingRate=%f,volumeGainDb=%f", pitch, speakingRate, volumeGainDb);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal;
+
+import static org.openhab.voice.googletts.internal.GoogleTTSService.*;
+
+import java.io.File;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.Locale;
+import java.util.Map;
+import java.util.Set;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.core.OpenHAB;
+import org.openhab.core.audio.AudioFormat;
+import org.openhab.core.audio.AudioStream;
+import org.openhab.core.audio.ByteArrayAudioStream;
+import org.openhab.core.auth.client.oauth2.OAuthFactory;
+import org.openhab.core.config.core.ConfigurableService;
+import org.openhab.core.voice.TTSException;
+import org.openhab.core.voice.TTSService;
+import org.openhab.core.voice.Voice;
+import org.openhab.voice.googletts.internal.protocol.AudioEncoding;
+import org.osgi.framework.Constants;
+import org.osgi.service.cm.ConfigurationAdmin;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Component;
+import org.osgi.service.component.annotations.Modified;
+import org.osgi.service.component.annotations.Reference;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Voice service implementation.
+ *
+ * @author Gabor Bicskei - Initial contribution
+ */
+@Component(configurationPid = SERVICE_PID, property = Constants.SERVICE_PID + "=" + SERVICE_PID)
+@ConfigurableService(category = SERVICE_CATEGORY, label = SERVICE_NAME
+ + " Text-to-Speech", description_uri = SERVICE_CATEGORY + ":" + SERVICE_ID)
+public class GoogleTTSService implements TTSService {
+ /**
+ * Service name
+ */
+ static final String SERVICE_NAME = "Google Cloud";
+
+ /**
+ * Service id
+ */
+ static final String SERVICE_ID = "googletts";
+
+ /**
+ * Service category
+ */
+ static final String SERVICE_CATEGORY = "voice";
+
+ /**
+ * Service pid
+ */
+ static final String SERVICE_PID = "org.openhab." + SERVICE_CATEGORY + "." + SERVICE_ID;
+
+ /**
+ * Cache folder under $userdata
+ */
+ private static final String CACHE_FOLDER_NAME = "cache";
+
+ /**
+ * Configuration parameters
+ */
+ private static final String PARAM_CLIENT_ID = "clientId";
+ private static final String PARAM_CLIENT_SECRET = "clientSecret";
+ static final String PARAM_AUTHCODE = "authcode";
+ private static final String PARAM_PITCH = "pitch";
+ private static final String PARAM_SPEAKING_RATE = "speakingRate";
+ private static final String PARAM_VOLUME_GAIN_DB = "volumeGainDb";
+ private static final String PARAM_PURGE_CACHE = "purgeCache";
+
+ /**
+ * Logger.
+ */
+ private final Logger logger = LoggerFactory.getLogger(GoogleTTSService.class);
+
+ /**
+ * Set of supported audio formats
+ */
+ private Set<AudioFormat> audioFormats = new HashSet<>();
+
+ /**
+ * Google Cloud TTS API implementation
+ */
+ private @NonNullByDefault({}) GoogleCloudAPI apiImpl;
+ private final ConfigurationAdmin configAdmin;
+ private final OAuthFactory oAuthFactory;
+
+ /**
+ * All voices for all supported locales
+ */
+ private Set<Voice> allVoices = new HashSet<>();
+
+ private final GoogleTTSConfig config = new GoogleTTSConfig();
+
+ @Activate
+ public GoogleTTSService(final @Reference ConfigurationAdmin configAdmin,
+ final @Reference OAuthFactory oAuthFactory) {
+ this.configAdmin = configAdmin;
+ this.oAuthFactory = oAuthFactory;
+ }
+
+ /**
+ * DS activate, with access to ConfigAdmin
+ */
+ @Activate
+ protected void activate(Map<String, Object> config) {
+ // create cache folder
+ File userData = new File(OpenHAB.getUserDataFolder());
+ File cacheFolder = new File(new File(userData, CACHE_FOLDER_NAME), SERVICE_PID);
+ if (!cacheFolder.exists()) {
+ cacheFolder.mkdirs();
+ }
+ logger.info("Using cache folder {}", cacheFolder.getAbsolutePath());
+
+ apiImpl = new GoogleCloudAPI(configAdmin, oAuthFactory, cacheFolder);
+ updateConfig(config);
+ }
+
+ /**
+ * Initializes the supported audio formats. Google supports three formats:
+ * <ul>
+ * <li>LINEAR16: Uncompressed 16-bit signed little-endian samples (Linear PCM). Audio content returned as LINEAR16
+ * also contains a WAV header.</li>
+ * <li>MP3: MP3 audio.</li>
+ * <li>OGG_OPUS: Opus encoded audio wrapped in an ogg container. This is not supported by openHAB.</li>
+ * </ul>
+ *
+ * @return Set of supported AudioFormats
+ */
+ private Set<AudioFormat> initAudioFormats() {
+ logger.trace("Initializing audio formats");
+ Set<AudioFormat> result = new HashSet<>();
+ for (String format : apiImpl.getSupportedAudioFormats()) {
+ AudioFormat audioFormat = getAudioFormat(format);
+ if (audioFormat != null) {
+ result.add(audioFormat);
+ logger.trace("Audio format supported: {}", format);
+ } else {
+ logger.trace("Audio format not supported: {}", format);
+ }
+ }
+ return Collections.unmodifiableSet(result);
+ }
+
+ /**
+ * Loads available voices from Google API
+ *
+ * @return Set of available voices.
+ */
+ private Set<Voice> initVoices() {
+ logger.trace("Initializing voices");
+ Set<Voice> result = new HashSet<>();
+ for (Locale locale : apiImpl.getSupportedLocales()) {
+ result.addAll(apiImpl.getVoicesForLocale(locale));
+ }
+ if (logger.isTraceEnabled()) {
+ for (Voice voice : result) {
+ logger.trace("Google Cloud TTS voice: {}", voice.getLabel());
+ }
+ }
+ return Collections.unmodifiableSet(result);
+ }
+
+ /**
+ * Called by the framework when the configuration was updated.
+ *
+ * @param newConfig Updated configuration
+ */
+ @Modified
+ private void updateConfig(Map<String, Object> newConfig) {
+ logger.debug("Updating configuration");
+ if (newConfig != null) {
+ // client id
+ String param = newConfig.containsKey(PARAM_CLIENT_ID) ? newConfig.get(PARAM_CLIENT_ID).toString() : null;
+ config.clientId = param;
+ if (param == null) {
+ logger.warn("Missing client id configuration to access Google Cloud TTS API.");
+ }
+ // client secret
+ param = newConfig.containsKey(PARAM_CLIENT_SECRET) ? newConfig.get(PARAM_CLIENT_SECRET).toString() : null;
+ config.clientSecret = param;
+ if (param == null) {
+ logger.warn("Missing client secret configuration to access Google Cloud TTS API.");
+ }
+ // authcode
+ param = newConfig.containsKey(PARAM_AUTHCODE) ? newConfig.get(PARAM_AUTHCODE).toString() : null;
+ config.authcode = param;
+
+ // pitch
+ param = newConfig.containsKey(PARAM_PITCH) ? newConfig.get(PARAM_PITCH).toString() : null;
+ if (param != null) {
+ config.pitch = Double.parseDouble(param);
+ }
+
+ // speakingRate
+ param = newConfig.containsKey(PARAM_SPEAKING_RATE) ? newConfig.get(PARAM_SPEAKING_RATE).toString() : null;
+ if (param != null) {
+ config.speakingRate = Double.parseDouble(param);
+ }
+
+ // volumeGainDb
+ param = newConfig.containsKey(PARAM_VOLUME_GAIN_DB) ? newConfig.get(PARAM_VOLUME_GAIN_DB).toString() : null;
+ if (param != null) {
+ config.volumeGainDb = Double.parseDouble(param);
+ }
+
+ // purgeCache
+ param = newConfig.containsKey(PARAM_PURGE_CACHE) ? newConfig.get(PARAM_PURGE_CACHE).toString() : null;
+ if (param != null) {
+ config.purgeCache = Boolean.parseBoolean(param);
+ }
+ logger.trace("New configuration: {}", config.toString());
+
+ if (config.clientId != null && !config.clientId.isEmpty() && config.clientSecret != null
+ && !config.clientSecret.isEmpty()) {
+ apiImpl.setConfig(config);
+ if (apiImpl.isInitialized()) {
+ allVoices = initVoices();
+ audioFormats = initAudioFormats();
+ }
+ }
+ } else {
+ logger.warn("Missing Google Cloud TTS configuration.");
+ }
+ }
+
+ @Override
+ public String getId() {
+ return SERVICE_ID;
+ }
+
+ @Override
+ public String getLabel(@Nullable Locale locale) {
+ return SERVICE_NAME;
+ }
+
+ @Override
+ public Set<Voice> getAvailableVoices() {
+ return allVoices;
+ }
+
+ @Override
+ public Set<AudioFormat> getSupportedFormats() {
+ return audioFormats;
+ }
+
+ /**
+ * Helper to create AudioFormat objects from Google names.
+ *
+ * @param format Google audio format.
+ * @return Audio format object.
+ */
+ private @Nullable AudioFormat getAudioFormat(String format) {
+ Integer bitDepth = 16;
+ Long frequency = 44100L;
+
+ AudioEncoding encoding = AudioEncoding.valueOf(format);
+
+ switch (encoding) {
+ case MP3:
+ // we use by default: MP3, 44.1 kHz, 16 bit, mono, with a bitrate of 64 kbps
+ return new AudioFormat(AudioFormat.CONTAINER_NONE, AudioFormat.CODEC_MP3, null, bitDepth, 64000,
+ frequency);
+ case LINEAR16:
+ // we use by default: WAV, 44.1 kHz, 16 bit, mono
+ return new AudioFormat(AudioFormat.CONTAINER_WAVE, AudioFormat.CODEC_PCM_SIGNED, null, bitDepth, null,
+ frequency);
+ default:
+ logger.warn("Audio format {} is not yet supported.", format);
+ return null;
+ }
+ }
+
+ /**
+ * Checks parameters and calls the API to synthesize voice.
+ *
+ * @param text Input text.
+ * @param voice Selected voice.
+ * @param requestedFormat Format that is supported by the target sink as well.
+ * @return Output audio stream
+ * @throws TTSException in case the service is unavailable or a parameter is invalid.
+ */
+ @Override
+ public AudioStream synthesize(String text, Voice voice, AudioFormat requestedFormat) throws TTSException {
+ logger.debug("Synthesize '{}' for voice '{}' in format {}", text, voice.getUID(), requestedFormat);
+ // Validate known api key
+ if (!apiImpl.isInitialized()) {
+ throw new TTSException("Missing service configuration.");
+ }
+ // Validate arguments
+ // trim text
+ String trimmedText = text.trim();
+ if (trimmedText.isEmpty()) {
+ throw new TTSException("The passed text is null or empty");
+ }
+ if (!this.allVoices.contains(voice)) {
+ throw new TTSException("The passed voice is unsupported");
+ }
+ boolean isAudioFormatSupported = false;
+ for (AudioFormat currentAudioFormat : this.audioFormats) {
+ if (currentAudioFormat.isCompatible(requestedFormat)) {
+ isAudioFormatSupported = true;
+ break;
+ }
+ }
+ if (!isAudioFormatSupported) {
+ throw new TTSException("The passed AudioFormat is unsupported");
+ }
+
+ // create the audio byte array for given text, locale, format
+ byte[] audio = apiImpl.synthesizeSpeech(trimmedText, (GoogleTTSVoice) voice, requestedFormat.getCodec());
+ if (audio == null) {
+ throw new TTSException("Could not read from Google Cloud TTS Service");
+ }
+ return new ByteArrayAudioStream(audio, requestedFormat);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal;
+
+import java.util.Locale;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.openhab.core.voice.Voice;
+import org.openhab.voice.googletts.internal.protocol.SsmlVoiceGender;
+
+/**
+ * Implementation of the Voice interface for Google Cloud TTS Service.
+ *
+ * @author Gabor Bicskei - Initial contribution
+ */
+@NonNullByDefault
+public class GoogleTTSVoice implements Voice {
+
+ /**
+ * Voice locale
+ */
+ private final Locale locale;
+
+ /**
+ * Voice label
+ */
+ private final String label;
+
+ /**
+ * Gender
+ */
+ private final String ssmlGender;
+
+ /**
+ * Constructs a Google Cloud TTS Voice for the passed data
+ *
+ * @param locale The Locale of the voice
+ * @param label The label of the voice
+ * @param ssmlGender Voice gender
+ */
+ GoogleTTSVoice(Locale locale, String label, String ssmlGender) {
+ this.locale = locale;
+ this.ssmlGender = ssmlGender;
+ this.label = label;
+ }
+
+ /**
+ * Globally unique identifier of the voice.
+ *
+ * @return A String uniquely identifying the voice globally
+ */
+ @Override
+ public String getUID() {
+ return "googletts:" + getTechnicalName();
+ }
+
+ /**
+ * Technical name of the voice.
+ *
+ * @return A String voice technical name
+ */
+ String getTechnicalName() {
+ return label.replaceAll("[^a-zA-Z0-9_]", "");
+ }
+
+ /**
+ * The voice label, used for GUI's or VUI's
+ *
+ * @return The voice label, may not be globally unique
+ */
+ @Override
+ public String getLabel() {
+ return this.label;
+ }
+
+ /**
+ * {@inheritDoc}
+ */
+ @Override
+ public Locale getLocale() {
+ return this.locale;
+ }
+
+ /**
+ * The voice gender.
+ *
+ * @return {@link SsmlVoiceGender} enum name.
+ */
+ String getSsmlGender() {
+ return ssmlGender;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal.protocol;
+
+/**
+ * The configuration of the synthesized audio.
+ *
+ * @author Wouter Born - Initial contribution
+ */
+public class AudioConfig {
+
+ /**
+ * Required. The format of the requested audio byte stream.
+ */
+ private AudioEncoding audioEncoding;
+
+ /**
+ * Optional speaking pitch, in the range [-20.0, 20.0]. 20 means increase 20 semitones from the original pitch. -20
+ * means decrease 20 semitones from the original pitch.
+ */
+ private Double pitch;
+
+ /**
+ * The synthesis sample rate (in hertz) for this audio. Optional. If this is different from the voice's natural
+ * sample rate, then the synthesizer will honor this request by converting to the desired sample rate (which might
+ * result in worse audio quality), unless the specified sample rate is not supported for the encoding chosen, in
+ * which case it will fail the request and return google.rpc.Code.INVALID_ARGUMENT.
+ */
+ private Long sampleRateHertz;
+
+ /**
+ * Optional speaking rate/speed, in the range [0.25, 4.0]. 1.0 is the normal native speed supported by the specific
+ * voice. 2.0 is twice as fast, and 0.5 is half as fast. If unset (0.0), defaults to the native 1.0 speed. Any other
+ * values < 0.25 or > 4.0 will return an error.
+ */
+ private Double speakingRate;
+
+ /**
+ * Optional volume gain (in dB) of the normal native volume supported by the specific voice, in the range [-96.0,
+ * 16.0]. If unset, or set to a value of 0.0 (dB), will play at normal native signal amplitude. A value of -6.0 (dB)
+ * will play at approximately half the amplitude of the normal native signal amplitude. A value of +6.0 (dB) will
+ * play at approximately twice the amplitude of the normal native signal amplitude. It is strongly recommended not to
+ * exceed +10 (dB), as there is usually no effective increase in loudness for any value greater than that.
+ */
+ private Double volumeGainDb;
+
+ public AudioConfig() {
+ }
+
+ public AudioConfig(AudioEncoding audioEncoding, Double pitch, Double speakingRate, Double volumeGainDb) {
+ this(audioEncoding, pitch, null, speakingRate, volumeGainDb);
+ }
+
+ public AudioConfig(AudioEncoding audioEncoding, Double pitch, Long sampleRateHertz, Double speakingRate,
+ Double volumeGainDb) {
+ this.audioEncoding = audioEncoding;
+ this.pitch = pitch;
+ this.sampleRateHertz = sampleRateHertz;
+ this.speakingRate = speakingRate;
+ this.volumeGainDb = volumeGainDb;
+ }
+
+ public AudioEncoding getAudioEncoding() {
+ return audioEncoding;
+ }
+
+ public Double getPitch() {
+ return pitch;
+ }
+
+ public Long getSampleRateHertz() {
+ return sampleRateHertz;
+ }
+
+ public Double getSpeakingRate() {
+ return speakingRate;
+ }
+
+ public Double getVolumeGainDb() {
+ return volumeGainDb;
+ }
+
+ public void setAudioEncoding(AudioEncoding audioEncoding) {
+ this.audioEncoding = audioEncoding;
+ }
+
+ public void setPitch(Double pitch) {
+ this.pitch = pitch;
+ }
+
+ public void setSampleRateHertz(Long sampleRateHertz) {
+ this.sampleRateHertz = sampleRateHertz;
+ }
+
+ public void setSpeakingRate(Double speakingRate) {
+ this.speakingRate = speakingRate;
+ }
+
+ public void setVolumeGainDb(Double volumeGainDb) {
+ this.volumeGainDb = volumeGainDb;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal.protocol;
+
+/**
+ * Configuration to set up audio encoder.
+ *
+ * @author Wouter Born - Initial contribution
+ */
+public enum AudioEncoding {
+
+ /**
+ * Not specified.
+ */
+ AUDIO_ENCODING_UNSPECIFIED,
+
+ /**
+ * Uncompressed 16-bit signed little-endian samples (Linear PCM). Audio content returned as LINEAR16 also contains a
+ * WAV header.
+ */
+ LINEAR16,
+
+ /**
+ * MP3 audio.
+ */
+ MP3,
+
+ /**
+ * Opus encoded audio wrapped in an ogg container. The result will be a file which can be played natively on
+ * Android, and in browsers (at least Chrome and Firefox). The quality of the encoding is considerably higher than
+ * MP3 while using approximately the same bitrate.
+ */
+ OGG_OPUS
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal.protocol;
+
+import java.util.List;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+
+/**
+ * The message returned to the client by the voices.list method.
+ *
+ * @author Wouter Born - Initial contribution
+ */
+@NonNullByDefault
+public class ListVoicesResponse {
+
+ /**
+ * The list of voices.
+ */
+ private @Nullable List<Voice> voices;
+
+ public @Nullable List<Voice> getVoices() {
+ return voices;
+ }
+
+ public void setVoices(List<Voice> voices) {
+ this.voices = voices;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal.protocol;
+
+/**
+ * Gender of the voice as described in SSML voice element.
+ *
+ * @author Wouter Born - Initial contribution
+ */
+public enum SsmlVoiceGender {
+
+ /**
+ * An unspecified gender. In VoiceSelectionParams, this means that the client doesn't care which gender the selected
+ * voice will have. In the Voice field of ListVoicesResponse, this may mean that the voice doesn't fit any of the
+ * other categories in this enum, or that the gender of the voice isn't known.
+ */
+ SSML_VOICE_GENDER_UNSPECIFIED,
+
+ /**
+ * A male voice.
+ */
+ MALE,
+
+ /**
+ * A female voice.
+ */
+ FEMALE,
+
+ /**
+ * A gender-neutral voice.
+ */
+ NEUTRAL
+
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal.protocol;
+
+/**
+ * Contains text input to be synthesized. Either text or ssml must be supplied. Supplying both or neither returns
+ * google.rpc.Code.INVALID_ARGUMENT. The input size is limited to 5000 characters.
+ *
+ * @author Wouter Born - Initial contribution
+ */
+public class SynthesisInput {
+
+ /**
+ * The SSML document to be synthesized. The SSML document must be valid and well-formed. Otherwise the RPC will fail
+ * and return google.rpc.Code.INVALID_ARGUMENT.
+ */
+ private String ssml;
+
+ /**
+ * The raw text to be synthesized.
+ */
+ private String text;
+
+ public SynthesisInput() {
+ }
+
+ public SynthesisInput(String text) {
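+ // input wrapped in a <speak> element is passed on as SSML, everything else as plain text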
+ if (text.startsWith("<speak>")) {
+ ssml = text;
+ } else {
+ this.text = text;
+ }
+ }
+
+ public String getSsml() {
+ return ssml;
+ }
+
+ public String getText() {
+ return text;
+ }
+
+ public void setSsml(String ssml) {
+ this.ssml = ssml;
+ }
+
+ public void setText(String text) {
+ this.text = text;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal.protocol;
+
+/**
+ * Synthesizes speech synchronously: receive results after all text input has been processed.
+ *
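+ * <p>
+ * Serialized with Gson, a request produces a JSON body like the following (values are illustrative):
+ *
+ * <pre>
+ * {"audioConfig": {"audioEncoding": "MP3", "pitch": 0.0, "speakingRate": 1.0, "volumeGainDb": 0.0},
+ *  "input": {"text": "Hello"},
+ *  "voice": {"languageCode": "en", "name": "en-US-Wavenet-D", "ssmlGender": "MALE"}}
+ * </pre>
+ *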
+ * @author Wouter Born - Initial contribution
+ */
+public class SynthesizeSpeechRequest {
+
+ /**
+ * Required. The configuration of the synthesized audio.
+ */
+ private AudioConfig audioConfig = new AudioConfig();
+
+ /**
+ * Required. The Synthesizer requires either plain text or SSML as input.
+ */
+ private SynthesisInput input = new SynthesisInput();
+
+ /**
+ * Required. The desired voice of the synthesized audio.
+ */
+ private VoiceSelectionParams voice = new VoiceSelectionParams();
+
+ public SynthesizeSpeechRequest() {
+ }
+
+ public SynthesizeSpeechRequest(AudioConfig audioConfig, SynthesisInput input, VoiceSelectionParams voice) {
+ this.audioConfig = audioConfig;
+ this.input = input;
+ this.voice = voice;
+ }
+
+ public AudioConfig getAudioConfig() {
+ return audioConfig;
+ }
+
+ public SynthesisInput getInput() {
+ return input;
+ }
+
+ public VoiceSelectionParams getVoice() {
+ return voice;
+ }
+
+ public void setAudioConfig(AudioConfig audioConfig) {
+ this.audioConfig = audioConfig;
+ }
+
+ public void setInput(SynthesisInput input) {
+ this.input = input;
+ }
+
+ public void setVoice(VoiceSelectionParams voice) {
+ this.voice = voice;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal.protocol;
+
+/**
+ * The message returned to the client by the text.synthesize method.
+ *
+ * @author Wouter Born - Initial contribution
+ */
+public class SynthesizeSpeechResponse {
+
+ /**
+ * The audio data bytes encoded as specified in the request, including the header (For LINEAR16 audio, we include
+ * the WAV header). Note: as with all bytes fields, protobuffers use a pure binary representation, whereas JSON
+ * representations use base64.
+ *
+ * A base64-encoded string.
+ */
+ private String audioContent;
+
+ public String getAudioContent() {
+ return audioContent;
+ }
+
+ public void setAudioContent(String audioContent) {
+ this.audioContent = audioContent;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal.protocol;
+
+import java.util.List;
+
+/**
+ * Description of a voice supported by the TTS service.
+ *
+ * @author Wouter Born - Initial contribution
+ */
+public class Voice {
+
+ /**
+ * The languages that this voice supports, expressed as BCP-47 language tags (e.g. "en-US", "es-419", "cmn-tw").
+ */
+ private List<String> languageCodes;
+
+ /**
+ * The name of this voice. Each distinct voice has a unique name.
+ */
+ private String name;
+
+ /**
+ * The natural sample rate (in hertz) for this voice.
+ */
+ private Long naturalSampleRateHertz;
+
+ /**
+ * The gender of this voice.
+ */
+ private SsmlVoiceGender ssmlGender;
+
+ public List<String> getLanguageCodes() {
+ return languageCodes;
+ }
+
+ public void setLanguageCodes(List<String> languageCodes) {
+ this.languageCodes = languageCodes;
+ }
+
+ public String getName() {
+ return name;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ public Long getNaturalSampleRateHertz() {
+ return naturalSampleRateHertz;
+ }
+
+ public void setNaturalSampleRateHertz(Long naturalSampleRateHertz) {
+ this.naturalSampleRateHertz = naturalSampleRateHertz;
+ }
+
+ public SsmlVoiceGender getSsmlGender() {
+ return ssmlGender;
+ }
+
+ public void setSsmlGender(SsmlVoiceGender ssmlGender) {
+ this.ssmlGender = ssmlGender;
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.voice.googletts.internal.protocol;
+
+/**
+ * Description of which voice to use for a synthesis request.
+ *
+ * @author Wouter Born - Initial contribution
+ */
+public class VoiceSelectionParams {
+
+ /**
+ * The language (and optionally also the region) of the voice expressed as a BCP-47 language tag, e.g. "en-US".
+ * Required. This should not include a script tag (e.g. use "cmn-cn" rather than "cmn-Hant-cn"), because the script
+ * will be inferred from the input provided in the SynthesisInput. The TTS service will use this parameter to help
+ * choose an appropriate voice. Note that the TTS service may choose a voice with a slightly different language code
+ * than the one selected; it may substitute a different region (e.g. using en-US rather than en-CA if there isn't a
+ * Canadian voice available), or even a different language, e.g. using "nb" (Norwegian Bokmal) instead of "no"
+ * (Norwegian)".
+ */
+ private String languageCode;
+
+ /**
+ * The name of the voice. Optional; if not set, the service will choose a voice based on the other parameters such
+ * as languageCode and ssmlGender.
+ */
+ private String name;
+
+ /**
+ * The preferred gender of the voice. Optional; if not set, the service will choose a voice based on the other
+ * parameters such as languageCode and name. Note that this is only a preference, not a requirement; if a voice of the
+ * appropriate gender is not available, the synthesizer should substitute a voice with a different gender rather
+ * than failing the request.
+ */
+ private SsmlVoiceGender ssmlGender;
+
+ public VoiceSelectionParams() {
+ }
+
+ public VoiceSelectionParams(String languageCode, String name, SsmlVoiceGender ssmlGender) {
+ this.languageCode = languageCode;
+ this.name = name;
+ this.ssmlGender = ssmlGender;
+ }
+
+ public String getLanguageCode() {
+ return languageCode;
+ }
+
+ public String getName() {
+ return name;
+ }
+
+ public SsmlVoiceGender getSsmlGender() {
+ return ssmlGender;
+ }
+
+ public void setLanguageCode(String languageCode) {
+ this.languageCode = languageCode;
+ }
+
+ public void setName(String name) {
+ this.name = name;
+ }
+
+ public void setSsmlGender(SsmlVoiceGender ssmlGender) {
+ this.ssmlGender = ssmlGender;
+ }
+}
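+
+// Illustrative sketch, not part of the original sources: per the Javadoc above, the voice name
+// may be left unset so the service selects a voice from languageCode and ssmlGender alone (both
+// are preferences rather than hard requirements). Assumes the SsmlVoiceGender enum in this
+// package mirrors the API's FEMALE constant.
+//
+//   VoiceSelectionParams params = new VoiceSelectionParams();
+//   params.setLanguageCode("en-US");
+//   params.setSsmlGender(SsmlVoiceGender.FEMALE);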
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<config-description:config-descriptions
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:config-description="https://openhab.org/schemas/config-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/config-description/v1.0.0
+ https://openhab.org/schemas/config-description-1.0.0.xsd">
+
+ <config-description uri="voice:googletts">
+ <parameter-group name="authentication">
+ <label>Authentication</label>
+ <description>Authentication for connecting to Google Cloud Platform.</description>
+ </parameter-group>
+ <parameter-group name="tts">
+ <label>TTS Configuration</label>
+ <description>Parameters for Google Cloud TTS API.</description>
+ </parameter-group>
+
+ <parameter name="clientId" type="text" required="true" groupName="authentication">
+ <label>Client ID</label>
+ <description>Google Cloud Platform OAuth 2.0 client ID.</description>
+ </parameter>
+ <parameter name="clientSecret" type="text" required="true" groupName="authentication">
+ <context>Password</context>
+ <label>Client Secret</label>
+ <description>Google Cloud Platform OAuth 2.0 client secret.</description>
+ </parameter>
+ <parameter name="authcode" type="text" groupName="authentication">
+ <label>Authorization Code</label>
+ <description><![CDATA[The auth-code is a one-time code needed to retrieve the necessary access tokens from Google Cloud Platform. <b>Please open the following URL in your browser ...</b> https://accounts.google.com/o/oauth2/auth?client_id={{clientId}}&redirect_uri=urn:ietf:wg:oauth:2.0:oob&scope=https://www.googleapis.com/auth/cloud-platform&response_type=code <b>... to generate an auth-code and paste it here</b>.]]></description>
+ </parameter>
+ <parameter name="pitch" type="decimal" min="-20" max="20" step="0.1" groupName="tts">
+ <label>Pitch</label>
+ <description>Customize the pitch of your selected voice, up to 20 semitones more or less than the default output.</description>
+ <default>0</default>
+ </parameter>
+ <parameter name="volumeGain" type="decimal" min="-96" max="16" groupName="tts">
+ <label>Volume Gain</label>
+ <description>Increase the volume of the output by up to 16 dB or decrease it by up to 96 dB.</description>
+ <default>0</default>
+ </parameter>
+ <parameter name="speakingRate" type="decimal" min="0.25" max="4" groupName="tts">
+ <label>Speaking Rate</label>
+ <description>Speaking rate, from 4 times slower (0.25) up to 4 times faster (4.0) than the normal rate (1.0).</description>
+ <default>1</default>
+ </parameter>
+ <parameter name="purgeCache" type="boolean">
+ <advanced>true</advanced>
+ <label>Purge Cache</label>
+ <description>Purges the cache, e.g. after testing different voice configuration parameters. When enabled, the cache is
+ purged once. Make sure to disable this setting again so the cache is maintained across restarts.</description>
+ <default>false</default>
+ </parameter>
+ </config-description>
+
+</config-description:config-descriptions>
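+
+<!-- Illustrative textual configuration sketch (assumption: these parameters are read from a
+ "org.openhab.voice.googletts" service PID, e.g. via conf/services/googletts.cfg):
+
+ org.openhab.voice.googletts:clientId=<your-client-id>
+ org.openhab.voice.googletts:clientSecret=<your-client-secret>
+ org.openhab.voice.googletts:pitch=0
+ org.openhab.voice.googletts:volumeGain=0
+ org.openhab.voice.googletts:speakingRate=1
+-->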
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons</groupId>
+ <artifactId>org.openhab.addons.reactor</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.addons.reactor.bundles</artifactId>
+ <packaging>pom</packaging>
+
+ <name>openHAB Add-ons :: Bundles</name>
+
+ <modules>
+ <module>org.openhab.binding.nest</module>
+ <module>org.openhab.persistence.dynamodb</module>
+ <module>org.openhab.persistence.influxdb</module>
+ <module>org.openhab.persistence.jdbc</module>
+ <module>org.openhab.persistence.jpa</module>
+ <module>org.openhab.persistence.mapdb</module>
+ <module>org.openhab.persistence.mongodb</module>
+ <module>org.openhab.persistence.rrd4j</module>
+ <module>org.openhab.voice.googletts</module>
+ </modules>
+
+ <dependencies>
+ <!-- openHAB core -->
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.compile</artifactId>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.openhab-core</artifactId>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.test</artifactId>
+ <type>pom</type>
+ <scope>test</scope>
+ </dependency>
+ <!-- Distribution -->
+ <dependency>
+ <groupId>org.apache.karaf.features</groupId>
+ <artifactId>framework</artifactId>
+ <version>${karaf.version}</version>
+ <type>kar</type>
+ <optional>true</optional>
+ <exclusions>
+ <exclusion>
+ <groupId>*</groupId>
+ <artifactId>*</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+ <!-- Repositories -->
+ <dependency>
+ <groupId>org.apache.karaf.features</groupId>
+ <artifactId>standard</artifactId>
+ <version>${karaf.version}</version>
+ <classifier>features</classifier>
+ <type>xml</type>
+ <scope>provided</scope>
+ </dependency>
+ </dependencies>
+
+ <properties>
+ <dep.noembedding/>
+ </properties>
+
+ <build>
+ <pluginManagement>
+ <plugins>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-jar-plugin</artifactId>
+ <configuration>
+ <archive>
+ <manifestFile>${project.build.outputDirectory}/META-INF/MANIFEST.MF</manifestFile>
+ </archive>
+ <skipIfEmpty>true</skipIfEmpty>
+ </configuration>
+ </plugin>
+ <plugin>
+ <groupId>org.apache.karaf.tooling</groupId>
+ <artifactId>karaf-maven-plugin</artifactId>
+ <version>${karaf.version}</version>
+ <extensions>true</extensions>
+ <configuration>
+ <startLevel>80</startLevel>
+ <aggregateFeatures>true</aggregateFeatures>
+ <checkDependencyChange>true</checkDependencyChange>
+ <failOnDependencyChange>false</failOnDependencyChange>
+ <logDependencyChanges>true</logDependencyChanges>
+ <overwriteChangedDependencies>true</overwriteChangedDependencies>
+ </configuration>
+ <executions>
+ <execution>
+ <id>compile</id>
+ <goals>
+ <goal>features-generate-descriptor</goal>
+ </goals>
+ <phase>generate-resources</phase>
+ <configuration>
+ <inputFile>${feature.directory}</inputFile>
+ </configuration>
+ </execution>
+ <execution>
+ <id>karaf-feature-verification</id>
+ <goals>
+ <goal>verify</goal>
+ </goals>
+ <phase>verify</phase>
+ <configuration>
+ <descriptors combine.children="append">
+ <!-- Apache Karaf -->
+ <descriptor>mvn:org.apache.karaf.features/framework/${karaf.version}/xml/features</descriptor>
+ <descriptor>mvn:org.apache.karaf.features/standard/${karaf.version}/xml/features</descriptor>
+ <!-- Current feature under verification -->
+ <descriptor>file:${project.build.directory}/feature/feature.xml</descriptor>
+ </descriptors>
+ <distribution>org.apache.karaf.features:framework</distribution>
+ <javase>${oh.java.version}</javase>
+ <framework>
+ <feature>framework</feature>
+ </framework>
+ <features>
+ <feature>openhab-*</feature>
+ </features>
+ <verifyTransitive>false</verifyTransitive>
+ <ignoreMissingConditions>true</ignoreMissingConditions>
+ <fail>first</fail>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </pluginManagement>
+
+ <plugins>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-maven-plugin</artifactId>
+ </plugin>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-source-plugin</artifactId>
+ <executions>
+ <execution>
+ <id>attach-sources</id>
+ <goals>
+ <goal>jar-no-fork</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
+ <plugin>
+ <groupId>org.apache.karaf.tooling</groupId>
+ <artifactId>karaf-maven-plugin</artifactId>
+ </plugin>
+ <!-- embed compile time dependencies by unpacking -->
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-dependency-plugin</artifactId>
+ <version>3.1.1</version>
+ <executions>
+ <execution>
+ <id>embed-dependencies</id>
+ <goals>
+ <goal>unpack-dependencies</goal>
+ </goals>
+ <configuration>
+ <includeScope>runtime</includeScope>
+ <includeTypes>jar</includeTypes>
+ <excludeGroupIds>javax.activation,org.apache.karaf.features</excludeGroupIds>
+ <excludeArtifactIds>${dep.noembedding}</excludeArtifactIds>
+ <outputDirectory>${project.build.directory}/classes</outputDirectory>
+ <overWriteReleases>true</overWriteReleases>
+ <overWriteSnapshots>true</overWriteSnapshots>
+ <excludeTransitive>true</excludeTransitive>
+ <type>jar</type>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </build>
+
+ <profiles>
+ <!-- remove unused classes / shrink jar -->
+ <profile>
+ <id>shrink-bundle</id>
+ <activation>
+ <file>
+ <exists>shrinkBundle.profile</exists>
+ </file>
+ </activation>
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>com.github.wvengen</groupId>
+ <artifactId>proguard-maven-plugin</artifactId>
+ <version>2.1.1</version>
+ <executions>
+ <execution>
+ <id>shrink-bundle</id>
+ <phase>package</phase>
+ <goals>
+ <goal>proguard</goal>
+ </goals>
+ </execution>
+ </executions>
+ <configuration>
+ <obfuscate>false</obfuscate>
+ <injarNotExistsSkip>true</injarNotExistsSkip>
+ <outputDirectory>${project.build.directory}</outputDirectory>
+ <libs>
+ <lib>${java.home}/lib/rt.jar</lib>
+ </libs>
+ <options>
+ <option>-dontwarn</option>
+ <option>-dontnote</option>
+ <option>-keep,includedescriptorclasses public class org.openhab.** { *; }</option>
+ <option>-printusage ${project.build.directory}/shrink_log.txt</option>
+ </options>
+ </configuration>
+ </plugin>
+ </plugins>
+ </build>
+ </profile>
+ <!-- suppress embedding of dependencies -->
+ <profile>
+ <id>no-embed-dependencies</id>
+ <activation>
+ <file>
+ <exists>noEmbedDependencies.profile</exists>
+ </file>
+ </activation>
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-dependency-plugin</artifactId>
+ <executions>
+ <execution>
+ <id>embed-dependencies</id>
+ <phase>none</phase>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </build>
+ </profile>
+ </profiles>
+
+</project>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.addons.features.karaf.openhab-addons-external</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.features.karaf</groupId>
+ <artifactId>org.openhab.addons.reactor.features.karaf</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.addons.features.karaf.openhab-addons-external3</artifactId>
+ <packaging>pom</packaging>
+
+ <name>openHAB Add-ons :: Features :: Karaf :: Add-ons External</name>
+ <description>openHAB Add-ons External</description>
+
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>org.codehaus.mojo</groupId>
+ <artifactId>build-helper-maven-plugin</artifactId>
+ <executions>
+ <execution>
+ <id>attach-artifact</id>
+ <goals>
+ <goal>attach-artifact</goal>
+ </goals>
+ <phase>package</phase>
+ <configuration>
+ <artifacts>
+ <artifact>
+ <file>src/main/resources/conf/dynamodb.cfg</file>
+ <type>cfg</type>
+ <classifier>dynamodb</classifier>
+ </artifact>
+ <artifact>
+ <file>src/main/resources/conf/influxdb.cfg</file>
+ <type>cfg</type>
+ <classifier>influxdb</classifier>
+ </artifact>
+ <artifact>
+ <file>src/main/resources/conf/jdbc.cfg</file>
+ <type>cfg</type>
+ <classifier>jdbc</classifier>
+ </artifact>
+ <artifact>
+ <file>src/main/resources/conf/jpa.cfg</file>
+ <type>cfg</type>
+ <classifier>jpa</classifier>
+ </artifact>
+ <artifact>
+ <file>src/main/resources/conf/mapdb.cfg</file>
+ <type>cfg</type>
+ <classifier>mapdb</classifier>
+ </artifact>
+ <artifact>
+ <file>src/main/resources/conf/rrd4j.cfg</file>
+ <type>cfg</type>
+ <classifier>rrd4j</classifier>
+ </artifact>
+ </artifacts>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </build>
+
+</project>
--- /dev/null
+############################ Amazon DynamoDB Persistence Service ##################################
+#
+# The following parameters are used to configure Amazon DynamoDB Persistence.
+#
+# Further details at https://www.openhab.org/addons/persistence/dynamodb/
+#
+
+# PID SETTING
+#
+# When configuring the persistence service using this file (instead of Paper UI),
+# make sure the first line in the configuration file is the
+# pid definition (remove the comment prefix #).
+
+#pid:org.openhab.dynamodb
+
+
+#
+# CONNECTION SETTINGS (follow OPTION 1 or OPTION 2)
+#
+
+# OPTION 1 (using accessKey and secretKey)
+#accessKey=AKIAIOSFODNN7EXAMPLE
+#secretKey=3+AAAAABBBbbbCCCCCCdddddd+7mnbIOLH
+#region=eu-west-1
+
+# OPTION 2 (using profilesConfigFile and profile)
+# where profilesConfigFile points to AWS credentials file
+# Please note that the user that runs openHAB must have appropriate read rights to the credentials file.
+#profilesConfigFile=/etc/openhab2/aws_creds
+#profile=fooprofile
+#region=eu-west-1
+
+# Credentials file example:
+#
+# [fooprofile]
+# aws_access_key_id=AKIAIOSFODNN7EXAMPLE
+# aws_secret_access_key=3+AAAAABBBbbbCCCCCCdddddd+7mnbIOLH
+
+
+#
+# ADVANCED CONFIGURATION (OPTIONAL)
+#
+
+# read capacity for the created tables
+#readCapacityUnits=1
+
+# write capacity for the created tables
+#writeCapacityUnits=1
+
+# table prefix used in the name of created tables
+#tablePrefix=openhab-
--- /dev/null
+# The database URL, e.g. http://127.0.0.1:8086 or https://127.0.0.1:8084.
+# Defaults to: http://127.0.0.1:8086
+# url=http(s)://<host>:<port>
+
+# The name of the database user, e.g. openhab.
+# Defaults to: openhab
+# user=<user>
+
+# The password of the database user.
+# password=
+
+# The name of the database, e.g. openhab.
+# Defaults to: openhab
+# db=<database>
--- /dev/null
+############################ JDBC Persistence Service ##################################
+# I N S T A L L J D B C P E R S I S T E N C E S E R V I C E
+#
+# https://github.com/openhab/openhab/wiki/JDBC-Persistence
+#
+# Tested databases/url-prefix: jdbc:derby, jdbc:h2, jdbc:hsqldb, jdbc:mariadb, jdbc:mysql, jdbc:postgresql, jdbc:sqlite
+#
+# derby, h2, hsqldb and sqlite can be embedded.
+# If no database is available, it will be created; for example the url 'jdbc:h2:./testH2' creates a new DB in the openHAB folder.
+#
+# To create a new database, for example on a MySQL server, use:
+# CREATE DATABASE 'yourDB' CHARACTER SET utf8 COLLATE utf8_general_ci;
+
+# D A T A B A S E C O N F I G
+# Some URL examples; 'service' internally identifies and activates the correct JDBC driver.
+# Required database url in the form 'jdbc:<service>:<host>[:<port>;<attributes>]'
+# url=jdbc:derby:./testDerby;create=true
+# url=jdbc:h2:./testH2
+# url=jdbc:hsqldb:./testHsqlDb
+# url=jdbc:mariadb://192.168.0.1:3306/testMariadb
+# url=jdbc:mysql://192.168.0.1:3306/testMysql
+# url=jdbc:postgresql://192.168.0.1:5432/testPostgresql
+# url=jdbc:sqlite:./testSqlite.db
+# url=
+
+# required database user
+#user=
+
+# required database password
+#password=
+
+# E R R O R H A N D L I N G
+# Number of errors after which the service is deactivated (optional, default: 0 -> ignore)
+#errReconnectThreshold=
+
+# I T E M O P E R A T I O N S
+# Optional tweaking of SQL data types
+# see: https://mybatis.github.io/mybatis-3/apidocs/reference/org/apache/ibatis/type/JdbcType.html
+# see: http://www.h2database.com/html/datatypes.html
+# see: http://www.postgresql.org/docs/9.3/static/datatype.html
+# defaults:
+#sqltype.CALL = VARCHAR(200)
+#sqltype.COLOR = VARCHAR(70)
+#sqltype.CONTACT = VARCHAR(6)
+#sqltype.DATETIME = DATETIME
+#sqltype.DIMMER = TINYINT
+#sqltype.LOCATION = VARCHAR(30)
+#sqltype.NUMBER = DOUBLE
+#sqltype.ROLLERSHUTTER = TINYINT
+#sqltype.STRING = VARCHAR(65500)
+#sqltype.SWITCH = VARCHAR(6)
+
+# For Itemtype "Number" default decimal digit count (optional, default: 3)
+#numberDecimalcount=
+
+# T A B L E O P E R A T I O N S
+# Table name prefix string (optional, default: "item")
+# For migration from the MySQL bundle, set to 'Item'.
+#tableNamePrefix=Item
+
+# Table name prefix generation using real item names instead of "item" (optional, default: false -> "item")
+# If true, 'tableNamePrefix' is ignored.
+#tableUseRealItemNames=
+
+# Table name suffix length (optional, default: 4 -> 0001-9999)
+# For migration from the MySQL bundle, set to 0.
+#tableIdDigitCount=
+
+# Rename existing tables using tableUseRealItemNames and tableIdDigitCount (optional, default: false)
+# USE WITH CARE! Deactivate after renaming is done!
+#rebuildTableNames=true
+
+# D A T A B A S E C O N N E C T I O N S
+# Some embedded databases can handle only one connection (optional, default: configured per database in package org.openhab.persistence.jdbc.db.*)
+# see: https://github.com/brettwooldridge/HikariCP/issues/256
+# jdbc.maximumPoolSize = 1
+# jdbc.minimumIdle = 1
+
+# T I M E K E E P I N G
+# (optional, default: false)
+#enableLogTime=true
--- /dev/null
+# connection string url
+#url=jdbc:postgresql://<host>:5432/<databasename>
+#url=jdbc:derby://<host>:1527/<databasename>;create=true
+
+# driver class name
+#driver=org.postgresql.Driver
+#driver=org.apache.derby.jdbc.ClientDriver
+
+# username
+#user=
+
+# password
+#password=
--- /dev/null
+# the commit interval in seconds (optional, defaults to '5')
+#commitinterval=5
+
+# issue a commit even if the state did not change (optional, defaults to 'false')
+#commitsamestate=false
--- /dev/null
+# configure specific rrd properties for given items in this file.
+# please refer to the documentation available at
+# https://www.openhab.org/addons/persistence/rrd4j/
+#
+# default_numeric and default_other are internally defined defnames and are used as
+# defaults when no other defname applies
+
+#<dsName>.def=[ABSOLUTE|COUNTER|DERIVE|GAUGE],<heartBeat>,[<minValue>|U],[<maxValue>|U],<sampleInterval>
+#<dsName>.archives=[AVERAGE|MIN|MAX|LAST|FIRST|TOTAL],<xff>,<samplesPerBox>,<boxCount>
+#<dsName>.items=<comma separated list of items for this dsName>
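+
+# Illustrative example (hypothetical defname and item names): a GAUGE datasource sampled every
+# 10 seconds with a 90 second heartbeat, keeping hourly averages (360 samples per box) for one
+# year (8760 boxes)
+#ctemp.def=GAUGE,90,U,U,10
+#ctemp.archives=AVERAGE,0.5,360,8760
+#ctemp.items=LivingRoomTemperature,BedroomTemperature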
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.addons.features.karaf.openhab-addons</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.features.karaf</groupId>
+ <artifactId>org.openhab.addons.reactor.features.karaf</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.addons.features.karaf.openhab-addons3</artifactId>
+ <packaging>feature</packaging>
+
+ <name>openHAB Add-ons :: Features :: Karaf :: Add-ons</name>
+ <description>openHAB Add-ons Features</description>
+
+ <dependencies>
+ <dependency>
+ <groupId>${project.groupId}</groupId>
+ <artifactId>org.openhab.addons.features.karaf.openhab-addons-external3</artifactId>
+ <version>${project.version}</version>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-antrun-plugin</artifactId>
+ <version>1.8</version>
+ <inherited>false</inherited>
+ <executions>
+ <execution>
+ <id>create-karaf-features</id>
+ <goals>
+ <goal>run</goal>
+ </goals>
+ <phase>generate-sources</phase>
+ <configuration>
+ <target>
+ <concat destfile="src/main/feature/feature.xml">
+ <header file="src/main/resources/header.xml" filtering="no"/>
+ <fileset dir="${basedirRoot}/bundles">
+ <include name="*/src/main/feature/feature.xml"/>
+ </fileset>
+ <filterchain>
+ <linecontainsRegExp>
+ <regexp pattern="(feature>)|(feature\s)|(bundle>)|(bundle\s)"/>
+ </linecontainsRegExp>
+ </filterchain>
+ <footer file="src/main/resources/footer.xml" filtering="no"/>
+ </concat>
+ </target>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ <plugin>
+ <groupId>org.apache.karaf.tooling</groupId>
+ <artifactId>karaf-maven-plugin</artifactId>
+ <executions>
+ <execution>
+ <id>karaf-feature-verification</id>
+ <configuration>
+ <features>
+ </features>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </build>
+
+</project>
--- /dev/null
+
+</features>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+
+ Copyright (c) 2010-2020 Contributors to the openHAB project
+
+ See the NOTICE file(s) distributed with this work for additional
+ information.
+
+ This program and the accompanying materials are made available under the
+ terms of the Eclipse Public License 2.0 which is available at
+ http://www.eclipse.org/legal/epl-2.0
+
+ SPDX-License-Identifier: EPL-2.0
+
+-->
+<features name="${project.artifactId}-${project.version}" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
+
+ <repository>mvn:org.openhab.core.features.karaf/org.openhab.core.features.karaf.openhab-core/${ohc.version}/xml/features</repository>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons</groupId>
+ <artifactId>org.openhab.addons.reactor</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <groupId>org.openhab.addons.features.karaf</groupId>
+ <artifactId>org.openhab.addons.reactor.features.karaf</artifactId>
+ <packaging>pom</packaging>
+
+ <name>openHAB Add-ons :: Features :: Karaf</name>
+
+ <properties>
+ <!-- JDBC database driver versions -->
+ <derby.version>10.12.1.1</derby.version>
+ <h2.version>1.4.191</h2.version>
+ <hsqldb.version>2.3.3</hsqldb.version>
+ <mariadb.version>1.3.5</mariadb.version>
+ <mysql.version>8.0.13</mysql.version>
+ <postgresql.version>9.4.1212</postgresql.version>
+ <sqlite.version>3.16.1</sqlite.version>
+ </properties>
+
+ <modules>
+ <module>openhab-addons</module>
+ <module>openhab-addons-external</module>
+ </modules>
+
+ <dependencies>
+ <!-- BOM, so features are built after bundles in parallel builds -->
+ <dependency>
+ <groupId>org.openhab.addons.bom</groupId>
+ <artifactId>org.openhab.addons.bom.openhab-addons</artifactId>
+ <version>${project.version}</version>
+ <type>pom</type>
+ </dependency>
+
+ <!-- Distribution -->
+ <dependency>
+ <groupId>org.apache.karaf.features</groupId>
+ <artifactId>framework</artifactId>
+ <version>${karaf.version}</version>
+ <type>kar</type>
+ <scope>provided</scope>
+ <exclusions>
+ <exclusion>
+ <!-- This should have been an optional dependency and will be fixed in Karaf 4.2.8 (KARAF-6462). -->
+ <groupId>org.knopflerfish.kf6</groupId>
+ <artifactId>log-API</artifactId>
+ </exclusion>
+ </exclusions>
+ </dependency>
+
+ <!-- Repositories -->
+ <dependency>
+ <groupId>org.apache.karaf.features</groupId>
+ <artifactId>standard</artifactId>
+ <version>${karaf.version}</version>
+ <classifier>features</classifier>
+ <type>xml</type>
+ <scope>provided</scope>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <pluginManagement>
+ <plugins>
+ <plugin>
+ <groupId>org.apache.karaf.tooling</groupId>
+ <artifactId>karaf-maven-plugin</artifactId>
+ <version>${karaf.version}</version>
+ <extensions>true</extensions>
+ <configuration>
+ <startLevel>80</startLevel>
+ <aggregateFeatures>true</aggregateFeatures>
+ <checkDependencyChange>true</checkDependencyChange>
+ <failOnDependencyChange>false</failOnDependencyChange>
+ <logDependencyChanges>true</logDependencyChanges>
+ <overwriteChangedDependencies>true</overwriteChangedDependencies>
+ </configuration>
+ <executions>
+ <execution>
+ <id>compile</id>
+ <goals>
+ <goal>features-generate-descriptor</goal>
+ </goals>
+ <phase>generate-resources</phase>
+ </execution>
+ <execution>
+ <id>karaf-feature-verification</id>
+ <goals>
+ <goal>verify</goal>
+ </goals>
+ <phase>process-resources</phase>
+ <configuration>
+ <descriptors combine.children="append">
+ <!-- Apache Karaf -->
+ <descriptor>mvn:org.apache.karaf.features/framework/${karaf.version}/xml/features</descriptor>
+ <descriptor>mvn:org.apache.karaf.features/standard/${karaf.version}/xml/features</descriptor>
+ <!-- Current feature under verification -->
+ <descriptor>file:${project.build.directory}/feature/feature.xml</descriptor>
+ </descriptors>
+ <distribution>org.apache.karaf.features:framework</distribution>
+ <javase>${oh.java.version}</javase>
+ <framework>
+ <feature>framework</feature>
+ </framework>
+ <features>
+ <feature>openhab-*</feature>
+ </features>
+ <verifyTransitive>false</verifyTransitive>
+ <ignoreMissingConditions>true</ignoreMissingConditions>
+ <fail>first</fail>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </pluginManagement>
+ </build>
+
+</project>
--- /dev/null
+## Karaf features
+
+In this directory you find the Karaf feature definitions.
+Karaf features define the dependencies of openHAB add-ons and OSGi bundles in general.
+
+If your openHAB add-on requires an external library, you most likely want to edit the Karaf features file as well, as sketched below.
+
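+A minimal illustrative feature entry (the names below are placeholders, not taken from this repository):
+
+```xml
+<feature name="openhab-binding-example" version="${project.version}">
+    <feature>openhab-runtime-base</feature>
+    <bundle start-level="80">mvn:org.openhab.addons.bundles/org.openhab.binding.example/${project.version}</bundle>
+    <bundle dependency="true">mvn:com.example/external-library/1.0.0</bundle>
+</feature>
+```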
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.addons.reactor.itests</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+-standalone: \
+ ../../bom/runtime-index/target/index.xml;name="org.openhab.core.bom.runtime-index",\
+ ../../bom/test-index/target/index.xml;name="org.openhab.core.bom.test-index",\
+ ../../bom/openhab-core-index/target/index.xml;name="org.openhab.core.bom.openhab-core-index",\
+ target/index.xml;name="self"
+
+-resolve.effective: active
+
+-tester: biz.aQute.tester.junit-platform
+
+# Run all integration tests which are named xyzTest
+Test-Cases: ${classes;CONCRETE;PUBLIC;NAMED;*Test}
+
+# A temporary inclusion until an R7 framework is available
+Import-Package: org.osgi.framework.*;version="[1.8,2)",*
+
+# Used by Objenesis/Mockito and not actually optional
+-runsystempackages: sun.reflect
+
+-runfw: org.eclipse.osgi
+-runee: JavaSE-11
+
+# The integration test itself does not export anything.
+Export-Package:
+-exportcontents:
+
+-runrequires.ee: \
+ bnd.identity;id='org.apache.servicemix.specs.activation-api-1.1',\
+ bnd.identity;id='org.apache.servicemix.specs.jaxb-api-2.2',\
+ bnd.identity;id='org.apache.servicemix.bundles.jaxb-impl'
+
+-runrequires.junit: \
+ bnd.identity;id='biz.aQute.tester.junit-platform',\
+ bnd.identity;id='junit-jupiter-engine'
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry excluding="**" kind="src" output="target/classes" path="src/main/resources">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ <attribute name="test" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.binding.nest.tests</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+# Nest Binding Tests
+
+[Nest Labs](https://nest.com/) developed or acquired the Wi-Fi enabled Nest Learning Thermostat, the Nest Protect Smoke+CO detector, and the Nest Cam.
+These devices are supported by the Nest binding, which communicates with Nest's servers over a secure RESTful API.
+Through the binding you can monitor ambient temperature and humidity, change the HVAC mode, change heat or cool setpoints, monitor and change your "home/away" status, and monitor your Nest Protects and Nest Cams.
+
+This project contains the integration tests that verify the Nest binding.
--- /dev/null
+-include: ../itest-common.bndrun
+
+Bundle-SymbolicName: ${project.artifactId}
+Fragment-Host: org.openhab.binding.nest
+
+-runrequires: \
+ bnd.identity;id='org.openhab.binding.nest.tests',\
+ bnd.identity;id='org.openhab.core.binding.xml',\
+ bnd.identity;id='org.openhab.core.thing.xml'
+
+# We would like to use the "volatile" storage only
+-runblacklist: \
+ bnd.identity;id='org.openhab.core.storage.json'
+
+-runproperties: logback.configurationFile=file:${.}/logback.xml
+
+#
+# done
+#
+-runbundles: \
+ ch.qos.logback.classic;version='[1.2.3,1.2.4)',\
+ ch.qos.logback.core;version='[1.2.3,1.2.4)',\
+ com.google.gson;version='[2.8.2,2.8.3)',\
+ javax.measure.unit-api;version='[1.0.0,1.0.1)',\
+ javax.xml.soap-api;version='[1.3.5,1.3.6)',\
+ org.apache.aries.javax.jax.rs-api;version='[1.0.0,1.0.1)',\
+ org.apache.felix.configadmin;version='[1.9.8,1.9.9)',\
+ org.apache.felix.http.servlet-api;version='[1.1.2,1.1.3)',\
+ org.apache.felix.scr;version='[2.1.10,2.1.11)',\
+ org.apache.servicemix.bundles.jaxb-impl;version='[2.2.11,2.2.12)',\
+ org.apache.servicemix.bundles.xstream;version='[1.4.7,1.4.8)',\
+ org.apache.servicemix.specs.activation-api-1.1;version='[2.9.0,2.9.1)',\
+ org.apache.servicemix.specs.annotation-api-1.3;version='[1.3.0,1.3.1)',\
+ org.apache.servicemix.specs.jaxb-api-2.2;version='[2.9.0,2.9.1)',\
+ org.apache.servicemix.specs.jaxws-api-2.2;version='[2.9.0,2.9.1)',\
+ org.apache.servicemix.specs.stax-api-1.2;version='[2.9.0,2.9.1)',\
+ org.apache.xbean.bundleutils;version='[4.12.0,4.12.1)',\
+ org.apache.xbean.finder;version='[4.12.0,4.12.1)',\
+ org.eclipse.equinox.event;version='[1.4.300,1.4.301)',\
+ org.eclipse.jetty.client;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.http;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.io;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.security;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.server;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.servlet;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.util;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.websocket.api;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.websocket.client;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.websocket.common;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.xml;version='[9.4.20,9.4.21)',\
+ org.objectweb.asm.commons;version='[7.1.0,7.1.1)',\
+ org.objectweb.asm.tree;version='[7.1.0,7.1.1)',\
+ org.objectweb.asm;version='[7.1.0,7.1.1)',\
+ org.objenesis;version='[2.6.0,2.6.1)',\
+ org.openhab.binding.nest.tests;version='[3.0.0,3.0.1)',\
+ org.openhab.binding.nest;version='[3.0.0,3.0.1)',\
+ org.openhab.core.binding.xml;version='[3.0.0,3.0.1)',\
+ org.openhab.core.config.core;version='[3.0.0,3.0.1)',\
+ org.openhab.core.config.discovery;version='[3.0.0,3.0.1)',\
+ org.openhab.core.config.xml;version='[3.0.0,3.0.1)',\
+ org.openhab.core.io.console;version='[3.0.0,3.0.1)',\
+ org.openhab.core.io.net;version='[3.0.0,3.0.1)',\
+ org.openhab.core.test;version='[3.0.0,3.0.1)',\
+ org.openhab.core.thing;version='[3.0.0,3.0.1)',\
+ org.openhab.core.thing.xml;version='[3.0.0,3.0.1)',\
+ org.openhab.core;version='[3.0.0,3.0.1)',\
+ org.ops4j.pax.swissbox.optional.jcl;version='[1.8.3,1.8.4)',\
+ org.ops4j.pax.web.pax-web-api;version='[7.2.11,7.2.12)',\
+ org.ops4j.pax.web.pax-web-jetty;version='[7.2.11,7.2.12)',\
+ org.ops4j.pax.web.pax-web-runtime;version='[7.2.11,7.2.12)',\
+ org.ops4j.pax.web.pax-web-spi;version='[7.2.11,7.2.12)',\
+ org.osgi.service.event;version='[1.4.0,1.4.1)',\
+ org.osgi.service.jaxrs;version='[1.0.0,1.0.1)',\
+ org.osgi.util.function;version='[1.1.0,1.1.1)',\
+ org.osgi.util.promise;version='[1.1.0,1.1.1)',\
+ slf4j.api;version='[1.7.25,1.7.26)',\
+ tec.uom.lib.uom-lib-common;version='[1.0.3,1.0.4)',\
+ tec.uom.se;version='[1.0.10,1.0.11)',\
+ biz.aQute.tester.junit-platform;version='[5.1.2,5.1.3)',\
+ junit-jupiter-api;version='[5.6.2,5.6.3)',\
+ junit-jupiter-engine;version='[5.6.2,5.6.3)',\
+ junit-platform-commons;version='[1.6.2,1.6.3)',\
+ junit-platform-engine;version='[1.6.2,1.6.3)',\
+ junit-platform-launcher;version='[1.6.2,1.6.3)',\
+ net.bytebuddy.byte-buddy;version='[1.10.13,1.10.14)',\
+ net.bytebuddy.byte-buddy-agent;version='[1.10.13,1.10.14)',\
+ org.hamcrest;version='[2.2.0,2.2.1)',\
+ org.mockito.mockito-core;version='[3.4.6,3.4.7)',\
+ org.opentest4j;version='[1.2.0,1.2.1)',\
+ jakarta.xml.bind-api;version='[2.3.3,2.3.4)',\
+ org.apache.aries.jax.rs.whiteboard;version='[1.0.9,1.0.10)'
--- /dev/null
+<configuration>
+
+ <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
+ <encoder>
+ <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
+ </encoder>
+ </appender>
+
+ <logger name="org.eclipse.jetty" level="info">
+ <appender-ref ref="STDOUT" />
+ </logger>
+
+ <root level="debug">
+ <appender-ref ref="STDOUT" />
+ </root>
+</configuration>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.itests</groupId>
+ <artifactId>org.openhab.addons.reactor.itests</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.binding.nest.tests</artifactId>
+
+ <name>openHAB Add-ons :: Integration Tests :: Nest Binding Tests</name>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.binding.nest</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ </dependencies>
+
+</project>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.handler;
+
+import static org.hamcrest.CoreMatchers.*;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.mockito.ArgumentMatchers.eq;
+import static org.mockito.Mockito.*;
+import static org.mockito.MockitoAnnotations.openMocks;
+
+import javax.ws.rs.client.ClientBuilder;
+
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.mockito.ArgumentCaptor;
+import org.mockito.Mock;
+import org.openhab.binding.nest.internal.config.NestBridgeConfiguration;
+import org.openhab.binding.nest.internal.handler.NestBridgeHandler;
+import org.openhab.binding.nest.internal.handler.NestRedirectUrlSupplier;
+import org.openhab.binding.nest.test.NestTestBridgeHandler;
+import org.openhab.core.config.core.Configuration;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.thing.ThingStatusInfo;
+import org.openhab.core.thing.binding.ThingHandler;
+import org.openhab.core.thing.binding.ThingHandlerCallback;
+import org.osgi.service.jaxrs.client.SseEventSourceFactory;
+
+/**
+ * Test cases for {@link NestBridgeHandler}.
+ *
+ * @author David Bennett - Initial contribution
+ */
+public class NestBridgeHandlerTest {
+
+ private ThingHandler handler;
+
+ private AutoCloseable mocksCloseable;
+
+ private @Mock Bridge bridge;
+ private @Mock ThingHandlerCallback callback;
+ private @Mock ClientBuilder clientBuilder;
+ private @Mock Configuration configuration;
+ private @Mock SseEventSourceFactory eventSourceFactory;
+ private @Mock NestRedirectUrlSupplier redirectUrlSupplier;
+
+ @BeforeEach
+ public void beforeEach() {
+ mocksCloseable = openMocks(this);
+ handler = new NestTestBridgeHandler(bridge, clientBuilder, eventSourceFactory, "http://localhost");
+ handler.setCallback(callback);
+ }
+
+ @AfterEach
+ public void afterEach() throws Exception {
+ mocksCloseable.close();
+ }
+
+ @SuppressWarnings("null")
+ @Test
+ public void initializeShouldCallTheCallback() {
+ when(bridge.getConfiguration()).thenReturn(configuration);
+ NestBridgeConfiguration bridgeConfig = new NestBridgeConfiguration();
+ when(configuration.as(eq(NestBridgeConfiguration.class))).thenReturn(bridgeConfig);
+ bridgeConfig.accessToken = "my token";
+
+ // we expect the handler#initialize method to call the callback during execution and
+ // pass it the thing and a ThingStatusInfo object containing the ThingStatus of the thing.
+ handler.initialize();
+
+ // the argument captor will capture the argument of type ThingStatusInfo given to the
+ // callback#statusUpdated method.
+ ArgumentCaptor<ThingStatusInfo> statusInfoCaptor = ArgumentCaptor.forClass(ThingStatusInfo.class);
+
+ // verify the interaction with the callback and capture the ThingStatusInfo argument:
+ verify(callback).statusUpdated(eq(bridge), statusInfoCaptor.capture());
+ // assert that the ThingStatusInfo given to the callback was built with the UNKNOWN status:
+ ThingStatusInfo thingStatusInfo = statusInfoCaptor.getValue();
+ assertThat(thingStatusInfo.getStatus(), is(equalTo(ThingStatus.UNKNOWN)));
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.handler;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.core.Is.is;
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+import static org.openhab.binding.nest.internal.data.NestDataUtil.*;
+import static org.openhab.core.library.types.OnOffType.*;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.junit.jupiter.api.Test;
+import org.openhab.binding.nest.internal.config.NestDeviceConfiguration;
+import org.openhab.binding.nest.internal.handler.NestCameraHandler;
+import org.openhab.core.config.core.Configuration;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.thing.ThingStatusDetail;
+import org.openhab.core.thing.ThingUID;
+import org.openhab.core.thing.binding.builder.ThingBuilder;
+
+/**
+ * Tests for {@link NestCameraHandler}.
+ *
+ * @author Wouter Born - Increase test coverage
+ */
+public class NestCameraHandlerTest extends NestThingHandlerOSGiTest {
+
+ private static final ThingUID CAMERA_UID = new ThingUID(THING_TYPE_CAMERA, "camera1");
+ private static final int CHANNEL_COUNT = 20;
+
+ public NestCameraHandlerTest() {
+ super(NestCameraHandler.class);
+ }
+
+ @Override
+ protected Thing buildThing(Bridge bridge) {
+ Map<String, Object> properties = new HashMap<>();
+ properties.put(NestDeviceConfiguration.DEVICE_ID, CAMERA1_DEVICE_ID);
+
+ return ThingBuilder.create(THING_TYPE_CAMERA, CAMERA_UID).withLabel("Test Camera").withBridge(bridge.getUID())
+ .withChannels(buildChannels(THING_TYPE_CAMERA, CAMERA_UID))
+ .withConfiguration(new Configuration(properties)).build();
+ }
+
+ @Test
+ public void completeCameraUpdate() throws IOException {
+ assertThat(thing.getChannels().size(), is(CHANNEL_COUNT));
+ assertThat(thing.getStatus(), is(ThingStatus.OFFLINE));
+
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ // Camera channel group
+ assertThatItemHasState(CHANNEL_CAMERA_APP_URL, new StringType("https://camera_app_url"));
+ assertThatItemHasState(CHANNEL_CAMERA_AUDIO_INPUT_ENABLED, ON);
+ assertThatItemHasState(CHANNEL_CAMERA_LAST_ONLINE_CHANGE, parseDateTimeType("2017-01-22T08:19:20.000Z"));
+ assertThatItemHasState(CHANNEL_CAMERA_PUBLIC_SHARE_ENABLED, OFF);
+ assertThatItemHasState(CHANNEL_CAMERA_PUBLIC_SHARE_URL, new StringType("https://camera_public_share_url"));
+ assertThatItemHasState(CHANNEL_CAMERA_SNAPSHOT_URL, new StringType("https://camera_snapshot_url"));
+ assertThatItemHasState(CHANNEL_CAMERA_STREAMING, OFF);
+ assertThatItemHasState(CHANNEL_CAMERA_VIDEO_HISTORY_ENABLED, OFF);
+ assertThatItemHasState(CHANNEL_CAMERA_WEB_URL, new StringType("https://camera_web_url"));
+
+ // Last event channel group
+ assertThatItemHasState(CHANNEL_LAST_EVENT_ACTIVITY_ZONES, new StringType("id1,id2"));
+ assertThatItemHasState(CHANNEL_LAST_EVENT_ANIMATED_IMAGE_URL,
+ new StringType("https://last_event_animated_image_url"));
+ assertThatItemHasState(CHANNEL_LAST_EVENT_APP_URL, new StringType("https://last_event_app_url"));
+ assertThatItemHasState(CHANNEL_LAST_EVENT_END_TIME, parseDateTimeType("2017-01-22T07:40:38.680Z"));
+ assertThatItemHasState(CHANNEL_LAST_EVENT_HAS_MOTION, ON);
+ assertThatItemHasState(CHANNEL_LAST_EVENT_HAS_PERSON, OFF);
+ assertThatItemHasState(CHANNEL_LAST_EVENT_HAS_SOUND, OFF);
+ assertThatItemHasState(CHANNEL_LAST_EVENT_IMAGE_URL, new StringType("https://last_event_image_url"));
+ assertThatItemHasState(CHANNEL_LAST_EVENT_START_TIME, parseDateTimeType("2017-01-22T07:40:19.020Z"));
+ assertThatItemHasState(CHANNEL_LAST_EVENT_URLS_EXPIRE_TIME, parseDateTimeType("2017-02-05T07:40:19.020Z"));
+ assertThatItemHasState(CHANNEL_LAST_EVENT_WEB_URL, new StringType("https://last_event_web_url"));
+
+ assertThatAllItemStatesAreNotNull();
+ }
+
+ @Test
+ public void incompleteCameraUpdate() throws IOException {
+ assertThat(thing.getChannels().size(), is(CHANNEL_COUNT));
+ assertThat(thing.getStatus(), is(ThingStatus.OFFLINE));
+
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+ assertThatAllItemStatesAreNotNull();
+
+ putStreamingEventData(fromFile(INCOMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.UNKNOWN)));
+ assertThatAllItemStatesAreNull();
+ }
+
+ @Test
+ public void cameraGone() throws IOException {
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ putStreamingEventData(fromFile(EMPTY_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.OFFLINE)));
+ assertThat(thing.getStatusInfo().getStatusDetail(), is(ThingStatusDetail.GONE));
+ }
+
+ @Test
+ public void channelRefresh() throws IOException {
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+ assertThatAllItemStatesAreNotNull();
+
+ updateAllItemStatesToNull();
+ assertThatAllItemStatesAreNull();
+
+ refreshAllChannels();
+ assertThatAllItemStatesAreNotNull();
+ }
+
+ @Test
+ public void handleStreamingCommands() throws IOException {
+ handleCommand(CHANNEL_CAMERA_STREAMING, ON);
+ assertNestApiPropertyState(CAMERA1_DEVICE_ID, "is_streaming", "true");
+
+ handleCommand(CHANNEL_CAMERA_STREAMING, OFF);
+ assertNestApiPropertyState(CAMERA1_DEVICE_ID, "is_streaming", "false");
+
+ handleCommand(CHANNEL_CAMERA_STREAMING, ON);
+ assertNestApiPropertyState(CAMERA1_DEVICE_ID, "is_streaming", "true");
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.handler;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.core.Is.is;
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+import static org.openhab.binding.nest.internal.data.NestDataUtil.*;
+import static org.openhab.core.library.types.OnOffType.OFF;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.junit.jupiter.api.Test;
+import org.openhab.binding.nest.internal.config.NestDeviceConfiguration;
+import org.openhab.binding.nest.internal.handler.NestSmokeDetectorHandler;
+import org.openhab.core.config.core.Configuration;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.thing.ThingStatusDetail;
+import org.openhab.core.thing.ThingUID;
+import org.openhab.core.thing.binding.builder.ThingBuilder;
+
+/**
+ * Tests for {@link NestSmokeDetectorHandler}.
+ *
+ * @author Wouter Born - Increase test coverage
+ */
+public class NestSmokeDetectorHandlerTest extends NestThingHandlerOSGiTest {
+
+ private static final ThingUID SMOKE_DETECTOR_UID = new ThingUID(THING_TYPE_SMOKE_DETECTOR, "smoke1");
+ private static final int CHANNEL_COUNT = 7;
+
+ public NestSmokeDetectorHandlerTest() {
+ super(NestSmokeDetectorHandler.class);
+ }
+
+ @Override
+ protected Thing buildThing(Bridge bridge) {
+ Map<String, Object> properties = new HashMap<>();
+ properties.put(NestDeviceConfiguration.DEVICE_ID, SMOKE1_DEVICE_ID);
+
+ return ThingBuilder.create(THING_TYPE_SMOKE_DETECTOR, SMOKE_DETECTOR_UID).withLabel("Test Smoke Detector")
+ .withBridge(bridge.getUID()).withChannels(buildChannels(THING_TYPE_SMOKE_DETECTOR, SMOKE_DETECTOR_UID))
+ .withConfiguration(new Configuration(properties)).build();
+ }
+
+ @Test
+ public void completeSmokeDetectorUpdate() throws IOException {
+ assertThat(thing.getChannels().size(), is(CHANNEL_COUNT));
+ assertThat(thing.getStatus(), is(ThingStatus.OFFLINE));
+
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ assertThatItemHasState(CHANNEL_CO_ALARM_STATE, new StringType("OK"));
+ assertThatItemHasState(CHANNEL_LAST_CONNECTION, parseDateTimeType("2017-02-02T20:53:05.338Z"));
+ assertThatItemHasState(CHANNEL_LAST_MANUAL_TEST_TIME, parseDateTimeType("2016-10-31T23:59:59.000Z"));
+ assertThatItemHasState(CHANNEL_LOW_BATTERY, OFF);
+ assertThatItemHasState(CHANNEL_MANUAL_TEST_ACTIVE, OFF);
+ assertThatItemHasState(CHANNEL_SMOKE_ALARM_STATE, new StringType("OK"));
+ assertThatItemHasState(CHANNEL_UI_COLOR_STATE, new StringType("GREEN"));
+
+ assertThatAllItemStatesAreNotNull();
+ }
+
+ @Test
+ public void incompleteSmokeDetectorUpdate() throws IOException {
+ assertThat(thing.getChannels().size(), is(CHANNEL_COUNT));
+ assertThat(thing.getStatus(), is(ThingStatus.OFFLINE));
+
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+ assertThatAllItemStatesAreNotNull();
+
+ putStreamingEventData(fromFile(INCOMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.UNKNOWN)));
+ assertThatAllItemStatesAreNull();
+ }
+
+ @Test
+ public void smokeDetectorGone() throws IOException {
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ putStreamingEventData(fromFile(EMPTY_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.OFFLINE)));
+ assertThat(thing.getStatusInfo().getStatusDetail(), is(ThingStatusDetail.GONE));
+ }
+
+ @Test
+ public void channelRefresh() throws IOException {
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+ assertThatAllItemStatesAreNotNull();
+
+ updateAllItemStatesToNull();
+ assertThatAllItemStatesAreNull();
+
+ refreshAllChannels();
+ assertThatAllItemStatesAreNotNull();
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.handler;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.core.Is.is;
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+import static org.openhab.binding.nest.internal.data.NestDataUtil.*;
+import static org.openhab.core.library.types.OnOffType.OFF;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.junit.jupiter.api.Test;
+import org.openhab.binding.nest.internal.config.NestStructureConfiguration;
+import org.openhab.binding.nest.internal.handler.NestStructureHandler;
+import org.openhab.core.config.core.Configuration;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.thing.ThingStatusDetail;
+import org.openhab.core.thing.ThingUID;
+import org.openhab.core.thing.binding.builder.ThingBuilder;
+
+/**
+ * Tests for {@link NestStructureHandler}.
+ *
+ * @author Wouter Born - Increase test coverage
+ */
+public class NestStructureHandlerTest extends NestThingHandlerOSGiTest {
+
+ private static final ThingUID STRUCTURE_UID = new ThingUID(THING_TYPE_STRUCTURE, "structure1");
+ private static final int CHANNEL_COUNT = 11;
+
+ public NestStructureHandlerTest() {
+ super(NestStructureHandler.class);
+ }
+
+ @Override
+ protected Thing buildThing(Bridge bridge) {
+ Map<String, Object> properties = new HashMap<>();
+ properties.put(NestStructureConfiguration.STRUCTURE_ID, STRUCTURE1_STRUCTURE_ID);
+
+ return ThingBuilder.create(THING_TYPE_STRUCTURE, STRUCTURE_UID).withLabel("Test Structure")
+ .withBridge(bridge.getUID()).withChannels(buildChannels(THING_TYPE_STRUCTURE, STRUCTURE_UID))
+ .withConfiguration(new Configuration(properties)).build();
+ }
+
+ @Test
+ public void completeStructureUpdate() throws IOException {
+ assertThat(thing.getChannels().size(), is(CHANNEL_COUNT));
+ assertThat(thing.getStatus(), is(ThingStatus.OFFLINE));
+
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ assertThatItemHasState(CHANNEL_AWAY, new StringType("HOME"));
+ assertThatItemHasState(CHANNEL_CO_ALARM_STATE, new StringType("OK"));
+ assertThatItemHasState(CHANNEL_COUNTRY_CODE, new StringType("US"));
+ assertThatItemHasState(CHANNEL_ETA_BEGIN, parseDateTimeType("2017-02-02T03:10:08.000Z"));
+ assertThatItemHasState(CHANNEL_PEAK_PERIOD_END_TIME, parseDateTimeType("2017-07-01T01:03:08.400Z"));
+ assertThatItemHasState(CHANNEL_PEAK_PERIOD_START_TIME, parseDateTimeType("2017-06-01T13:31:10.870Z"));
+ assertThatItemHasState(CHANNEL_POSTAL_CODE, new StringType("98056"));
+ assertThatItemHasState(CHANNEL_RUSH_HOUR_REWARDS_ENROLLMENT, OFF);
+ assertThatItemHasState(CHANNEL_SECURITY_STATE, new StringType("OK"));
+ assertThatItemHasState(CHANNEL_SMOKE_ALARM_STATE, new StringType("OK"));
+ assertThatItemHasState(CHANNEL_TIME_ZONE, new StringType("America/Los_Angeles"));
+
+ assertThatAllItemStatesAreNotNull();
+ }
+
+ @Test
+ public void incompleteStructureUpdate() throws IOException {
+ assertThat(thing.getChannels().size(), is(CHANNEL_COUNT));
+ assertThat(thing.getStatus(), is(ThingStatus.OFFLINE));
+
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+ assertThatAllItemStatesAreNotNull();
+
+ putStreamingEventData(fromFile(INCOMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+ assertThatAllItemStatesAreNull();
+ }
+
+ @Test
+ public void structureGone() throws IOException {
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ putStreamingEventData(fromFile(EMPTY_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.OFFLINE)));
+ assertThat(thing.getStatusInfo().getStatusDetail(), is(ThingStatusDetail.GONE));
+ }
+
+ @Test
+ public void channelRefresh() throws IOException {
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+ assertThatAllItemStatesAreNotNull();
+
+ updateAllItemStatesToNull();
+ assertThatAllItemStatesAreNull();
+
+ refreshAllChannels();
+ assertThatAllItemStatesAreNotNull();
+ }
+
+ @Test
+ public void handleAwayCommands() throws IOException {
+ handleCommand(CHANNEL_AWAY, new StringType("AWAY"));
+ assertNestApiPropertyState(STRUCTURE1_STRUCTURE_ID, "away", "away");
+
+ handleCommand(CHANNEL_AWAY, new StringType("HOME"));
+ assertNestApiPropertyState(STRUCTURE1_STRUCTURE_ID, "away", "home");
+
+ handleCommand(CHANNEL_AWAY, new StringType("AWAY"));
+ assertNestApiPropertyState(STRUCTURE1_STRUCTURE_ID, "away", "away");
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.handler;
+
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.core.Is.is;
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+import static org.openhab.binding.nest.internal.data.NestDataUtil.*;
+import static org.openhab.core.library.types.OnOffType.*;
+import static org.openhab.core.library.unit.ImperialUnits.FAHRENHEIT;
+import static org.openhab.core.library.unit.SIUnits.CELSIUS;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.junit.jupiter.api.Test;
+import org.openhab.binding.nest.internal.config.NestDeviceConfiguration;
+import org.openhab.binding.nest.internal.handler.NestThermostatHandler;
+import org.openhab.core.config.core.Configuration;
+import org.openhab.core.library.types.QuantityType;
+import org.openhab.core.library.types.StringType;
+import org.openhab.core.library.unit.SmartHomeUnits;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingStatus;
+import org.openhab.core.thing.ThingStatusDetail;
+import org.openhab.core.thing.ThingUID;
+import org.openhab.core.thing.binding.builder.ThingBuilder;
+
+/**
+ * Tests for {@link NestThermostatHandler}.
+ *
+ * @author Wouter Born - Increase test coverage
+ */
+public class NestThermostatHandlerTest extends NestThingHandlerOSGiTest {
+
+ private static final ThingUID THERMOSTAT_UID = new ThingUID(THING_TYPE_THERMOSTAT, "thermostat1");
+ private static final int CHANNEL_COUNT = 25;
+
+ public NestThermostatHandlerTest() {
+ super(NestThermostatHandler.class);
+ }
+
+ @Override
+ protected Thing buildThing(Bridge bridge) {
+ Map<String, Object> properties = new HashMap<>();
+ properties.put(NestDeviceConfiguration.DEVICE_ID, THERMOSTAT1_DEVICE_ID);
+
+ return ThingBuilder.create(THING_TYPE_THERMOSTAT, THERMOSTAT_UID).withLabel("Test Thermostat")
+ .withBridge(bridge.getUID()).withChannels(buildChannels(THING_TYPE_THERMOSTAT, THERMOSTAT_UID))
+ .withConfiguration(new Configuration(properties)).build();
+ }
+
+ @Test
+ public void completeThermostatCelsiusUpdate() throws IOException {
+ assertThat(thing.getChannels().size(), is(CHANNEL_COUNT));
+ assertThat(thing.getStatus(), is(ThingStatus.OFFLINE));
+
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME, CELSIUS));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ assertThatItemHasState(CHANNEL_CAN_COOL, OFF);
+ assertThatItemHasState(CHANNEL_CAN_HEAT, ON);
+ assertThatItemHasState(CHANNEL_ECO_MAX_SET_POINT, new QuantityType<>(24, CELSIUS));
+ assertThatItemHasState(CHANNEL_ECO_MIN_SET_POINT, new QuantityType<>(12.5, CELSIUS));
+ assertThatItemHasState(CHANNEL_FAN_TIMER_ACTIVE, OFF);
+ assertThatItemHasState(CHANNEL_FAN_TIMER_DURATION, new QuantityType<>(15, SmartHomeUnits.MINUTE));
+ assertThatItemHasState(CHANNEL_FAN_TIMER_TIMEOUT, parseDateTimeType("1970-01-01T00:00:00.000Z"));
+ assertThatItemHasState(CHANNEL_HAS_FAN, ON);
+ assertThatItemHasState(CHANNEL_HAS_LEAF, ON);
+ assertThatItemHasState(CHANNEL_HUMIDITY, new QuantityType<>(25, SmartHomeUnits.PERCENT));
+ assertThatItemHasState(CHANNEL_LAST_CONNECTION, parseDateTimeType("2017-02-02T21:00:06.000Z"));
+ assertThatItemHasState(CHANNEL_LOCKED, OFF);
+ assertThatItemHasState(CHANNEL_LOCKED_MAX_SET_POINT, new QuantityType<>(22, CELSIUS));
+ assertThatItemHasState(CHANNEL_LOCKED_MIN_SET_POINT, new QuantityType<>(20, CELSIUS));
+ assertThatItemHasState(CHANNEL_MAX_SET_POINT, new QuantityType<>(24, CELSIUS));
+ assertThatItemHasState(CHANNEL_MIN_SET_POINT, new QuantityType<>(20, CELSIUS));
+ assertThatItemHasState(CHANNEL_MODE, new StringType("HEAT"));
+ assertThatItemHasState(CHANNEL_PREVIOUS_MODE, new StringType("HEAT"));
+ assertThatItemHasState(CHANNEL_SET_POINT, new QuantityType<>(15.5, CELSIUS));
+ assertThatItemHasState(CHANNEL_STATE, new StringType("OFF"));
+ assertThatItemHasState(CHANNEL_SUNLIGHT_CORRECTION_ACTIVE, OFF);
+ assertThatItemHasState(CHANNEL_SUNLIGHT_CORRECTION_ENABLED, ON);
+ assertThatItemHasState(CHANNEL_TEMPERATURE, new QuantityType<>(19, CELSIUS));
+ assertThatItemHasState(CHANNEL_TIME_TO_TARGET, new QuantityType<>(0, SmartHomeUnits.MINUTE));
+ assertThatItemHasState(CHANNEL_USING_EMERGENCY_HEAT, OFF);
+
+ assertThatAllItemStatesAreNotNull();
+ }
+
+ @Test
+ public void completeThermostatFahrenheitUpdate() throws IOException {
+ assertThat(thing.getChannels().size(), is(CHANNEL_COUNT));
+ assertThat(thing.getStatus(), is(ThingStatus.OFFLINE));
+
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME, FAHRENHEIT));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ assertThatItemHasState(CHANNEL_CAN_COOL, OFF);
+ assertThatItemHasState(CHANNEL_CAN_HEAT, ON);
+ assertThatItemHasState(CHANNEL_ECO_MAX_SET_POINT, new QuantityType<>(76, FAHRENHEIT));
+ assertThatItemHasState(CHANNEL_ECO_MIN_SET_POINT, new QuantityType<>(55, FAHRENHEIT));
+ assertThatItemHasState(CHANNEL_FAN_TIMER_ACTIVE, OFF);
+ assertThatItemHasState(CHANNEL_FAN_TIMER_DURATION, new QuantityType<>(15, SmartHomeUnits.MINUTE));
+ assertThatItemHasState(CHANNEL_FAN_TIMER_TIMEOUT, parseDateTimeType("1970-01-01T00:00:00.000Z"));
+ assertThatItemHasState(CHANNEL_HAS_FAN, ON);
+ assertThatItemHasState(CHANNEL_HAS_LEAF, ON);
+ assertThatItemHasState(CHANNEL_HUMIDITY, new QuantityType<>(25, SmartHomeUnits.PERCENT));
+ assertThatItemHasState(CHANNEL_LAST_CONNECTION, parseDateTimeType("2017-02-02T21:00:06.000Z"));
+ assertThatItemHasState(CHANNEL_LOCKED, OFF);
+ assertThatItemHasState(CHANNEL_LOCKED_MAX_SET_POINT, new QuantityType<>(72, FAHRENHEIT));
+ assertThatItemHasState(CHANNEL_LOCKED_MIN_SET_POINT, new QuantityType<>(68, FAHRENHEIT));
+ assertThatItemHasState(CHANNEL_MAX_SET_POINT, new QuantityType<>(75, FAHRENHEIT));
+ assertThatItemHasState(CHANNEL_MIN_SET_POINT, new QuantityType<>(68, FAHRENHEIT));
+ assertThatItemHasState(CHANNEL_MODE, new StringType("HEAT"));
+ assertThatItemHasState(CHANNEL_PREVIOUS_MODE, new StringType("HEAT"));
+ assertThatItemHasState(CHANNEL_SET_POINT, new QuantityType<>(60, FAHRENHEIT));
+ assertThatItemHasState(CHANNEL_STATE, new StringType("OFF"));
+ assertThatItemHasState(CHANNEL_SUNLIGHT_CORRECTION_ACTIVE, OFF);
+ assertThatItemHasState(CHANNEL_SUNLIGHT_CORRECTION_ENABLED, ON);
+ assertThatItemHasState(CHANNEL_TEMPERATURE, new QuantityType<>(66, FAHRENHEIT));
+ assertThatItemHasState(CHANNEL_TIME_TO_TARGET, new QuantityType<>(0, SmartHomeUnits.MINUTE));
+ assertThatItemHasState(CHANNEL_USING_EMERGENCY_HEAT, OFF);
+
+ assertThatAllItemStatesAreNotNull();
+ }
+
+ @Test
+ public void incompleteThermostatUpdate() throws IOException {
+ assertThat(thing.getChannels().size(), is(CHANNEL_COUNT));
+ assertThat(thing.getStatus(), is(ThingStatus.OFFLINE));
+
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+ assertThatAllItemStatesAreNotNull();
+
+ putStreamingEventData(fromFile(INCOMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.UNKNOWN)));
+ assertThatAllItemStatesAreNull();
+ }
+
+ @Test
+ public void thermostatGone() throws IOException {
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ putStreamingEventData(fromFile(EMPTY_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.OFFLINE)));
+ assertThat(thing.getStatusInfo().getStatusDetail(), is(ThingStatusDetail.GONE));
+ }
+
+ @Test
+ public void channelRefresh() throws IOException {
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+ assertThatAllItemStatesAreNotNull();
+
+ updateAllItemStatesToNull();
+ assertThatAllItemStatesAreNull();
+
+ refreshAllChannels();
+ assertThatAllItemStatesAreNotNull();
+ }
+
+ @Test
+ public void handleFanTimerActiveCommands() throws IOException {
+ handleCommand(CHANNEL_FAN_TIMER_ACTIVE, ON);
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, "fan_timer_active", "true");
+
+ handleCommand(CHANNEL_FAN_TIMER_ACTIVE, OFF);
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, "fan_timer_active", "false");
+
+ handleCommand(CHANNEL_FAN_TIMER_ACTIVE, ON);
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, "fan_timer_active", "true");
+ }
+
+ @Test
+ public void handleFanTimerDurationCommands() throws IOException {
+ int[] durations = { 15, 30, 45, 60, 120, 240, 480, 960, 15 };
+ for (int duration : durations) {
+ handleCommand(CHANNEL_FAN_TIMER_DURATION, new QuantityType<>(duration, SmartHomeUnits.MINUTE));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, "fan_timer_duration", String.valueOf(duration));
+ }
+ }
+
+ @Test
+ public void handleMaxSetPointCelsiusCommands() throws IOException {
+ celsiusCommandsTest(CHANNEL_MAX_SET_POINT, "target_temperature_high_c");
+ }
+
+ @Test
+ public void handleMaxSetPointFahrenheitCommands() throws IOException {
+ fahrenheitCommandsTest(CHANNEL_MAX_SET_POINT, "target_temperature_high_f");
+ }
+
+ @Test
+ public void handleMinSetPointCelsiusCommands() throws IOException {
+ celsiusCommandsTest(CHANNEL_MIN_SET_POINT, "target_temperature_low_c");
+ }
+
+ @Test
+ public void handleMinSetPointFahrenheitCommands() throws IOException {
+ fahrenheitCommandsTest(CHANNEL_MIN_SET_POINT, "target_temperature_low_f");
+ }
+
+ @Test
+ public void handleChannelModeCommands() throws IOException {
+ handleCommand(CHANNEL_MODE, new StringType("HEAT"));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, "hvac_mode", "heat");
+
+ handleCommand(CHANNEL_MODE, new StringType("COOL"));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, "hvac_mode", "cool");
+
+ handleCommand(CHANNEL_MODE, new StringType("HEAT_COOL"));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, "hvac_mode", "heat-cool");
+
+ handleCommand(CHANNEL_MODE, new StringType("ECO"));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, "hvac_mode", "eco");
+
+ handleCommand(CHANNEL_MODE, new StringType("OFF"));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, "hvac_mode", "off");
+
+ handleCommand(CHANNEL_MODE, new StringType("HEAT"));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, "hvac_mode", "heat");
+ }
+
+ @Test
+ public void handleSetPointCelsiusCommands() throws IOException {
+ celsiusCommandsTest(CHANNEL_SET_POINT, "target_temperature_c");
+ }
+
+ @Test
+ public void handleSetPointFahrenheitCommands() throws IOException {
+ fahrenheitCommandsTest(CHANNEL_SET_POINT, "target_temperature_f");
+ }
+
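+    /**
+     * Verifies (based on the assertions below) that Celsius set point commands are rounded to the nearest
+     * 0.5 degree and that Fahrenheit commands are converted to Celsius before being sent to the Nest API.
+     */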
+ private void celsiusCommandsTest(String channelId, String apiPropertyName) throws IOException {
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME, CELSIUS));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ handleCommand(channelId, new QuantityType<>(20, CELSIUS));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "20.0");
+
+ handleCommand(channelId, new QuantityType<>(21.123, CELSIUS));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "21.0");
+
+ handleCommand(channelId, new QuantityType<>(22.541, CELSIUS));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "22.5");
+
+ handleCommand(channelId, new QuantityType<>(23.74, CELSIUS));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "23.5");
+
+ handleCommand(channelId, new QuantityType<>(23.75, CELSIUS));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "24.0");
+
+ handleCommand(channelId, new QuantityType<>(70, FAHRENHEIT));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "21.0");
+ }
+
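+    /**
+     * Verifies (based on the assertions below) that Fahrenheit set point commands are rounded to the nearest
+     * whole degree and that Celsius commands are converted to Fahrenheit before being sent to the Nest API.
+     */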
+ private void fahrenheitCommandsTest(String channelId, String apiPropertyName) throws IOException {
+ waitForAssert(() -> assertThat(bridge.getStatus(), is(ThingStatus.ONLINE)));
+ putStreamingEventData(fromFile(COMPLETE_DATA_FILE_NAME, FAHRENHEIT));
+ waitForAssert(() -> assertThat(thing.getStatus(), is(ThingStatus.ONLINE)));
+
+ handleCommand(channelId, new QuantityType<>(70, FAHRENHEIT));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "70");
+
+ handleCommand(channelId, new QuantityType<>(71.123, FAHRENHEIT));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "71");
+
+ handleCommand(channelId, new QuantityType<>(71.541, FAHRENHEIT));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "72");
+
+ handleCommand(channelId, new QuantityType<>(72.74, FAHRENHEIT));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "73");
+
+ handleCommand(channelId, new QuantityType<>(73.75, FAHRENHEIT));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "74");
+
+ handleCommand(channelId, new QuantityType<>(21, CELSIUS));
+ assertNestApiPropertyState(THERMOSTAT1_DEVICE_ID, apiPropertyName, "70");
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.handler;
+
+import static org.hamcrest.CoreMatchers.*;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.core.Is.is;
+import static org.hamcrest.core.IsNot.not;
+import static org.mockito.Mockito.*;
+import static org.openhab.binding.nest.internal.rest.NestStreamingRestClient.PUT;
+
+import java.io.IOException;
+import java.time.Instant;
+import java.time.format.DateTimeParseException;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.TimeZone;
+import java.util.function.Function;
+
+import javax.ws.rs.client.ClientBuilder;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jetty.servlet.ServletHolder;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.BeforeAll;
+import org.junit.jupiter.api.BeforeEach;
+import org.openhab.binding.nest.internal.config.NestBridgeConfiguration;
+import org.openhab.binding.nest.internal.handler.NestBaseHandler;
+import org.openhab.binding.nest.test.NestTestApiServlet;
+import org.openhab.binding.nest.test.NestTestBridgeHandler;
+import org.openhab.binding.nest.test.NestTestHandlerFactory;
+import org.openhab.binding.nest.test.NestTestServer;
+import org.openhab.core.config.core.Configuration;
+import org.openhab.core.events.EventPublisher;
+import org.openhab.core.items.Item;
+import org.openhab.core.items.ItemFactory;
+import org.openhab.core.items.ItemNotFoundException;
+import org.openhab.core.items.ItemRegistry;
+import org.openhab.core.items.events.ItemEventFactory;
+import org.openhab.core.library.types.DateTimeType;
+import org.openhab.core.test.TestPortUtil;
+import org.openhab.core.test.java.JavaOSGiTest;
+import org.openhab.core.test.storage.VolatileStorageService;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.Channel;
+import org.openhab.core.thing.ChannelUID;
+import org.openhab.core.thing.ManagedThingProvider;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingProvider;
+import org.openhab.core.thing.ThingTypeUID;
+import org.openhab.core.thing.ThingUID;
+import org.openhab.core.thing.binding.ThingHandlerFactory;
+import org.openhab.core.thing.binding.builder.BridgeBuilder;
+import org.openhab.core.thing.binding.builder.ChannelBuilder;
+import org.openhab.core.thing.link.ItemChannelLink;
+import org.openhab.core.thing.link.ManagedItemChannelLinkProvider;
+import org.openhab.core.thing.type.ChannelDefinition;
+import org.openhab.core.thing.type.ChannelGroupDefinition;
+import org.openhab.core.thing.type.ChannelGroupType;
+import org.openhab.core.thing.type.ChannelGroupTypeRegistry;
+import org.openhab.core.thing.type.ChannelType;
+import org.openhab.core.thing.type.ChannelTypeRegistry;
+import org.openhab.core.thing.type.ThingType;
+import org.openhab.core.thing.type.ThingTypeRegistry;
+import org.openhab.core.types.Command;
+import org.openhab.core.types.RefreshType;
+import org.openhab.core.types.State;
+import org.openhab.core.types.UnDefType;
+import org.osgi.service.component.ComponentContext;
+import org.osgi.service.jaxrs.client.SseEventSourceFactory;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * {@link NestThingHandlerOSGiTest} is an abstract base class for Nest OSGi based tests.
+ *
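+ * <p>
+ * Concrete subclasses pass their handler class to the constructor and implement {@code buildThing(Bridge)},
+ * for example (a sketch mirroring the handler tests in this module):
+ *
+ * <pre>{@code
+ * public class NestThermostatHandlerTest extends NestThingHandlerOSGiTest {
+ *     public NestThermostatHandlerTest() {
+ *         super(NestThermostatHandler.class);
+ *     }
+ * }
+ * }</pre>
+ *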
+ * @author Wouter Born - Increase test coverage
+ */
+public abstract class NestThingHandlerOSGiTest extends JavaOSGiTest {
+
+ private static final String SERVER_HOST = "127.0.0.1";
+ private static final int SERVER_PORT = TestPortUtil.findFreePort();
+ private static final int SERVER_TIMEOUT = -1;
+ private static final String REDIRECT_URL = "http://" + SERVER_HOST + ":" + SERVER_PORT;
+
+ private final Logger logger = LoggerFactory.getLogger(NestThingHandlerOSGiTest.class);
+
+ private static NestTestServer server;
+ private static NestTestApiServlet servlet = new NestTestApiServlet();
+
+ private ChannelTypeRegistry channelTypeRegistry;
+ private ChannelGroupTypeRegistry channelGroupTypeRegistry;
+ private ItemFactory itemFactory;
+ private ItemRegistry itemRegistry;
+ private EventPublisher eventPublisher;
+ private ManagedThingProvider managedThingProvider;
+ private ThingTypeRegistry thingTypeRegistry;
+ private ManagedItemChannelLinkProvider managedItemChannelLinkProvider;
+ private VolatileStorageService volatileStorageService = new VolatileStorageService();
+
+ protected Bridge bridge;
+ protected NestTestBridgeHandler bridgeHandler;
+ protected Thing thing;
+ protected NestBaseHandler<?> thingHandler;
+ private Class<? extends NestBaseHandler<?>> thingClass;
+
+ private NestTestHandlerFactory nestTestHandlerFactory;
+ private @NonNullByDefault({}) ClientBuilder clientBuilder;
+ private @NonNullByDefault({}) SseEventSourceFactory eventSourceFactory;
+
+ public NestThingHandlerOSGiTest(Class<? extends NestBaseHandler<?>> thingClass) {
+ this.thingClass = thingClass;
+ }
+
+ @BeforeAll
+ public static void setUpClass() throws Exception {
+ ServletHolder holder = new ServletHolder(servlet);
+ server = new NestTestServer(SERVER_HOST, SERVER_PORT, SERVER_TIMEOUT, holder);
+ server.startServer();
+ }
+
+ @BeforeEach
+ public void setUp() throws ItemNotFoundException {
+ registerService(volatileStorageService);
+
+ managedThingProvider = getService(ThingProvider.class, ManagedThingProvider.class);
+ assertThat("Could not get ManagedThingProvider", managedThingProvider, is(notNullValue()));
+
+ thingTypeRegistry = getService(ThingTypeRegistry.class);
+ assertThat("Could not get ThingTypeRegistry", thingTypeRegistry, is(notNullValue()));
+
+ channelTypeRegistry = getService(ChannelTypeRegistry.class);
+ assertThat("Could not get ChannelTypeRegistry", channelTypeRegistry, is(notNullValue()));
+
+ channelGroupTypeRegistry = getService(ChannelGroupTypeRegistry.class);
+ assertThat("Could not get ChannelGroupTypeRegistry", channelGroupTypeRegistry, is(notNullValue()));
+
+ eventPublisher = getService(EventPublisher.class);
+ assertThat("Could not get EventPublisher", eventPublisher, is(notNullValue()));
+
+ itemFactory = getService(ItemFactory.class);
+ assertThat("Could not get ItemFactory", itemFactory, is(notNullValue()));
+
+ itemRegistry = getService(ItemRegistry.class);
+ assertThat("Could not get ItemRegistry", itemRegistry, is(notNullValue()));
+
+ managedItemChannelLinkProvider = getService(ManagedItemChannelLinkProvider.class);
+ assertThat("Could not get ManagedItemChannelLinkProvider", managedItemChannelLinkProvider, is(notNullValue()));
+
+ clientBuilder = getService(ClientBuilder.class);
+ assertThat("Could not get ClientBuilder", clientBuilder, is(notNullValue()));
+
+ eventSourceFactory = getService(SseEventSourceFactory.class);
+ assertThat("Could not get SseEventSourceFactory", eventSourceFactory, is(notNullValue()));
+
+ ComponentContext componentContext = mock(ComponentContext.class);
+ when(componentContext.getBundleContext()).thenReturn(bundleContext);
+
+ nestTestHandlerFactory = new NestTestHandlerFactory(clientBuilder, eventSourceFactory);
+ nestTestHandlerFactory.activate(componentContext,
+ Map.of(NestTestHandlerFactory.REDIRECT_URL_CONFIG_PROPERTY, REDIRECT_URL));
+ registerService(nestTestHandlerFactory);
+
+ nestTestHandlerFactory = getService(ThingHandlerFactory.class, NestTestHandlerFactory.class);
+ assertThat("Could not get NestTestHandlerFactory", nestTestHandlerFactory, is(notNullValue()));
+
+ bridge = buildBridge();
+ thing = buildThing(bridge);
+
+ bridgeHandler = addThing(bridge, NestTestBridgeHandler.class);
+ thingHandler = addThing(thing, thingClass);
+
+ createAndLinkItems();
+ assertThatAllItemStatesAreNull();
+ }
+
+ @AfterEach
+ public void tearDown() {
+ servlet.reset();
+ servlet.closeConnections();
+
+ if (thing != null) {
+ managedThingProvider.remove(thing.getUID());
+ }
+ if (bridge != null) {
+ managedThingProvider.remove(bridge.getUID());
+ }
+
+ unregisterService(volatileStorageService);
+ }
+
+ protected Bridge buildBridge() {
+ Map<String, Object> properties = new HashMap<>();
+ properties.put(NestBridgeConfiguration.ACCESS_TOKEN,
+ "c.eQ5QBBPiFOTNzPHbmZPcE9yPZ7GayzLusifgQR2DQRFNyUS9ESvlhJF0D7vG8Y0TFV39zX1vIOsWrv8RKCMrFepNUb9FqHEboa4MtWLUsGb4tD9oBh0jrV4HooJUmz5sVA5KZR0dkxyLYyPc");
+ properties.put(NestBridgeConfiguration.PINCODE, "64P2XRYT");
+ properties.put(NestBridgeConfiguration.PRODUCT_ID, "8fdf9885-ca07-4252-1aa3-f3d5ca9589e0");
+ properties.put(NestBridgeConfiguration.PRODUCT_SECRET, "QITLR3iyUlWaj9dbvCxsCKp4f");
+
+ return BridgeBuilder.create(NestTestBridgeHandler.THING_TYPE_TEST_BRIDGE, "test_account")
+ .withLabel("Test Account").withConfiguration(new Configuration(properties)).build();
+ }
+
+ protected abstract Thing buildThing(Bridge bridge);
+
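+    /**
+     * Builds the channels of a thing from its thing type definition, including channels contributed by
+     * channel groups (which use the {@code group#id} channel UID convention).
+     */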
+ protected List<Channel> buildChannels(ThingTypeUID thingTypeUID, ThingUID thingUID) {
+ waitForAssert(() -> assertThat(thingTypeRegistry.getThingType(thingTypeUID), notNullValue()));
+
+ ThingType thingType = thingTypeRegistry.getThingType(thingTypeUID);
+
+ List<Channel> channels = new ArrayList<>();
+ channels.addAll(buildChannels(thingUID, thingType.getChannelDefinitions(), (id) -> id));
+
+ for (ChannelGroupDefinition channelGroupDefinition : thingType.getChannelGroupDefinitions()) {
+ ChannelGroupType channelGroupType = channelGroupTypeRegistry
+ .getChannelGroupType(channelGroupDefinition.getTypeUID());
+ String groupId = channelGroupDefinition.getId();
+ if (channelGroupType != null) {
+ channels.addAll(
+ buildChannels(thingUID, channelGroupType.getChannelDefinitions(), (id) -> groupId + "#" + id));
+ }
+ }
+
+ channels.sort((Channel c1, Channel c2) -> c1.getUID().getId().compareTo(c2.getUID().getId()));
+ return channels;
+ }
+
+ protected List<Channel> buildChannels(ThingUID thingUID, List<ChannelDefinition> channelDefinitions,
+ Function<String, String> channelIdFunction) {
+ List<Channel> result = new ArrayList<>();
+ for (ChannelDefinition channelDefinition : channelDefinitions) {
+ ChannelType channelType = channelTypeRegistry.getChannelType(channelDefinition.getChannelTypeUID());
+ if (channelType != null) {
+ result.add(ChannelBuilder
+ .create(new ChannelUID(thingUID, channelIdFunction.apply(channelDefinition.getId())),
+ channelType.getItemType())
+ .build());
+ }
+ }
+ return result;
+ }
+
+ @SuppressWarnings("unchecked")
+ protected <T> T addThing(Thing thing, Class<T> thingHandlerClass) {
+ assertThat(thing.getHandler(), is(nullValue()));
+ managedThingProvider.add(thing);
+ waitForAssert(() -> assertThat(thing.getHandler(), notNullValue()));
+ assertThat(thing.getConfiguration(), is(notNullValue()));
+ assertThat(thing.getHandler(), is(instanceOf(thingHandlerClass)));
+ return (T) thing.getHandler();
+ }
+
+ protected String getThingId() {
+ return thing.getUID().getId();
+ }
+
+ protected ThingUID getThingUID() {
+ return thing.getUID();
+ }
+
+ protected void putStreamingEventData(String json) throws IOException {
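+        // Collapse the pretty-printed JSON into a single line so it can be queued as one SSE PUT event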
+ String singleLineJson = json.replaceAll("\n\r\\s+", "").replaceAll("\n\\s+", "").replaceAll("\n\r", "")
+ .replaceAll("\n", "");
+ servlet.queueEvent(PUT, singleLineJson);
+ }
+
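+    /**
+     * Creates an item for each channel of the thing and links it to that channel, so state updates
+     * published by the handler can be asserted via the item registry.
+     */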
+ protected void createAndLinkItems() {
+ thing.getChannels().forEach(c -> {
+ String itemName = getItemName(c.getUID().getId());
+ Item item = itemFactory.createItem(c.getAcceptedItemType(), itemName);
+ if (item != null) {
+ itemRegistry.add(item);
+ }
+ managedItemChannelLinkProvider.add(new ItemChannelLink(itemName, c.getUID()));
+ });
+ }
+
+ protected void assertThatItemHasState(String channelId, State state) {
+ waitForAssert(() -> assertThat("Wrong state for item of channel '" + channelId + "' ", getItemState(channelId),
+ is(state)));
+ }
+
+ protected void assertThatItemHasNotState(String channelId, State state) {
+ waitForAssert(() -> assertThat("Wrong state for item of channel '" + channelId + "' ", getItemState(channelId),
+ is(not(state))));
+ }
+
+ protected void assertThatAllItemStatesAreNull() {
+ thing.getChannels().forEach(c -> assertThatItemHasState(c.getUID().getId(), UnDefType.NULL));
+ }
+
+ protected void assertThatAllItemStatesAreNotNull() {
+ thing.getChannels().forEach(c -> assertThatItemHasNotState(c.getUID().getId(), UnDefType.NULL));
+ }
+
+ protected ChannelUID getChannelUID(String channelId) {
+ return new ChannelUID(getThingUID(), channelId);
+ }
+
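+    /**
+     * Derives the name of the item linked to a channel by prefixing the channel ID with the thing ID
+     * and replacing the {@code #} group separator with {@code _}.
+     */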
+ protected String getItemName(String channelId) {
+        return getThingId() + "_" + channelId.replace("#", "_");
+ }
+
+ private State getItemState(String channelId) {
+ String itemName = getItemName(channelId);
+ try {
+ return itemRegistry.getItem(itemName).getState();
+ } catch (ItemNotFoundException e) {
+ throw new AssertionError("Item with name '" + itemName + "' not found");
+ }
+ }
+
+ protected void logItemStates() {
+ thing.getChannels().forEach(c -> {
+ String channelId = c.getUID().getId();
+ String itemName = getItemName(channelId);
+ logger.debug("{} = {}", itemName, getItemState(channelId));
+ });
+ }
+
+ protected void updateAllItemStatesToNull() {
+ thing.getChannels().forEach(c -> updateItemState(c.getUID().getId(), UnDefType.NULL));
+ }
+
+ protected void refreshAllChannels() {
+ thing.getChannels().forEach(c -> thingHandler.handleCommand(c.getUID(), RefreshType.REFRESH));
+ }
+
+ protected void handleCommand(String channelId, Command command) {
+ thingHandler.handleCommand(getChannelUID(channelId), command);
+ }
+
+ protected void updateItemState(String channelId, State state) {
+ String itemName = getItemName(channelId);
+ eventPublisher.post(ItemEventFactory.createStateEvent(itemName, state));
+ }
+
+ protected void assertNestApiPropertyState(String nestId, String propertyName, String state) {
+ waitForAssert(() -> assertThat(servlet.getNestIdPropertyState(nestId, propertyName), is(state)));
+ }
+
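+    /**
+     * Parses an ISO-8601 instant (e.g. {@code 2017-02-02T21:00:06.000Z}) into a {@link DateTimeType}
+     * in the default time zone.
+     */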
+ public static DateTimeType parseDateTimeType(String text) {
+ try {
+ return new DateTimeType(Instant.parse(text).atZone(TimeZone.getDefault().toZoneId()));
+ } catch (DateTimeParseException e) {
+ throw new IllegalArgumentException("Invalid date time argument: " + text, e);
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import static org.junit.jupiter.api.Assertions.*;
+import static org.openhab.binding.nest.internal.data.NestDataUtil.*;
+
+import java.io.IOException;
+import java.text.SimpleDateFormat;
+import java.util.Date;
+import java.util.TimeZone;
+
+import org.junit.jupiter.api.Test;
+import org.openhab.core.library.unit.SIUnits;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Test cases for Gson parsing of the model classes.
+ *
+ * @author David Bennett - Initial contribution
+ * @author Wouter Born - Increase test coverage
+ */
+public class GsonParsingTest {
+
+ private final Logger logger = LoggerFactory.getLogger(GsonParsingTest.class);
+
+    private static void assertEqualDateTime(String expected, Date actual) {
+        SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
+        // The expected values are UTC instants, so format in UTC instead of the default time zone
+        sdf.setTimeZone(TimeZone.getTimeZone("UTC"));
+        assertEquals(expected, sdf.format(actual));
+    }
+
+ @Test
+ public void verifyCompleteInput() throws IOException {
+ TopLevelData topLevel = fromJson("top-level-data.json", TopLevelData.class);
+
+        assertEquals(1, topLevel.getDevices().getThermostats().size());
+ assertNotNull(topLevel.getDevices().getThermostats().get(THERMOSTAT1_DEVICE_ID));
+        assertEquals(2, topLevel.getDevices().getCameras().size());
+ assertNotNull(topLevel.getDevices().getCameras().get(CAMERA1_DEVICE_ID));
+ assertNotNull(topLevel.getDevices().getCameras().get(CAMERA2_DEVICE_ID));
+        assertEquals(4, topLevel.getDevices().getSmokeCoAlarms().size());
+ assertNotNull(topLevel.getDevices().getSmokeCoAlarms().get(SMOKE1_DEVICE_ID));
+ assertNotNull(topLevel.getDevices().getSmokeCoAlarms().get(SMOKE2_DEVICE_ID));
+ assertNotNull(topLevel.getDevices().getSmokeCoAlarms().get(SMOKE3_DEVICE_ID));
+ assertNotNull(topLevel.getDevices().getSmokeCoAlarms().get(SMOKE4_DEVICE_ID));
+ }
+
+ @Test
+ public void verifyCompleteStreamingInput() throws IOException {
+ TopLevelStreamingData topLevelStreamingData = fromJson("top-level-streaming-data.json",
+ TopLevelStreamingData.class);
+
+ assertEquals("/", topLevelStreamingData.getPath());
+
+ TopLevelData data = topLevelStreamingData.getData();
+        assertEquals(1, data.getDevices().getThermostats().size());
+ assertNotNull(data.getDevices().getThermostats().get(THERMOSTAT1_DEVICE_ID));
+        assertEquals(2, data.getDevices().getCameras().size());
+ assertNotNull(data.getDevices().getCameras().get(CAMERA1_DEVICE_ID));
+ assertNotNull(data.getDevices().getCameras().get(CAMERA2_DEVICE_ID));
+        assertEquals(4, data.getDevices().getSmokeCoAlarms().size());
+ assertNotNull(data.getDevices().getSmokeCoAlarms().get(SMOKE1_DEVICE_ID));
+ assertNotNull(data.getDevices().getSmokeCoAlarms().get(SMOKE2_DEVICE_ID));
+ assertNotNull(data.getDevices().getSmokeCoAlarms().get(SMOKE3_DEVICE_ID));
+ assertNotNull(data.getDevices().getSmokeCoAlarms().get(SMOKE4_DEVICE_ID));
+ }
+
+ @Test
+ public void verifyThermostat() throws IOException {
+ Thermostat thermostat = fromJson("thermostat-data.json", Thermostat.class);
+ logger.debug("Thermostat: {}", thermostat);
+
+ assertTrue(thermostat.isOnline());
+ assertTrue(thermostat.isCanHeat());
+ assertTrue(thermostat.isHasLeaf());
+ assertFalse(thermostat.isCanCool());
+ assertFalse(thermostat.isFanTimerActive());
+ assertFalse(thermostat.isLocked());
+ assertFalse(thermostat.isSunlightCorrectionActive());
+ assertTrue(thermostat.isSunlightCorrectionEnabled());
+ assertFalse(thermostat.isUsingEmergencyHeat());
+ assertEquals(THERMOSTAT1_DEVICE_ID, thermostat.getDeviceId());
+ assertEquals(Integer.valueOf(15), thermostat.getFanTimerDuration());
+ assertEqualDateTime("2017-02-02T21:00:06.000Z", thermostat.getLastConnection());
+ assertEqualDateTime("1970-01-01T00:00:00.000Z", thermostat.getFanTimerTimeout());
+ assertEquals(Double.valueOf(24.0), thermostat.getEcoTemperatureHigh());
+ assertEquals(Double.valueOf(12.5), thermostat.getEcoTemperatureLow());
+ assertEquals(Double.valueOf(22.0), thermostat.getLockedTempMax());
+ assertEquals(Double.valueOf(20.0), thermostat.getLockedTempMin());
+ assertEquals(Thermostat.Mode.HEAT, thermostat.getMode());
+ assertEquals("Living Room (Living Room)", thermostat.getName());
+ assertEquals("Living Room Thermostat (Living Room)", thermostat.getNameLong());
+        assertNull(thermostat.getPreviousHvacMode());
+ assertEquals("5.6-7", thermostat.getSoftwareVersion());
+ assertEquals(Thermostat.State.OFF, thermostat.getHvacState());
+ assertEquals(STRUCTURE1_STRUCTURE_ID, thermostat.getStructureId());
+ assertEquals(Double.valueOf(15.5), thermostat.getTargetTemperature());
+ assertEquals(Double.valueOf(24.0), thermostat.getTargetTemperatureHigh());
+ assertEquals(Double.valueOf(20.0), thermostat.getTargetTemperatureLow());
+ assertEquals(SIUnits.CELSIUS, thermostat.getTemperatureUnit());
+ assertEquals(Integer.valueOf(0), thermostat.getTimeToTarget());
+ assertEquals(THERMOSTAT1_WHERE_ID, thermostat.getWhereId());
+ assertEquals("Living Room", thermostat.getWhereName());
+ }
+
+ @Test
+ public void thermostatTimeToTargetSupportedValueParsing() {
+ assertEquals((Integer) 0, Thermostat.parseTimeToTarget("~0"));
+ assertEquals((Integer) 5, Thermostat.parseTimeToTarget("<5"));
+ assertEquals((Integer) 10, Thermostat.parseTimeToTarget("<10"));
+ assertEquals((Integer) 15, Thermostat.parseTimeToTarget("~15"));
+ assertEquals((Integer) 90, Thermostat.parseTimeToTarget("~90"));
+ assertEquals((Integer) 120, Thermostat.parseTimeToTarget(">120"));
+ }
+
+ @Test
+ public void thermostatTimeToTargetUnsupportedValueParsing() {
+ assertThrows(NumberFormatException.class, () -> Thermostat.parseTimeToTarget("#5"));
+ }
+
+ @Test
+ public void verifyCamera() throws IOException {
+ Camera camera = fromJson("camera-data.json", Camera.class);
+ logger.debug("Camera: {}", camera);
+
+ assertTrue(camera.isOnline());
+ assertEquals("Upstairs", camera.getName());
+ assertEquals("Upstairs Camera", camera.getNameLong());
+ assertEquals(STRUCTURE1_STRUCTURE_ID, camera.getStructureId());
+ assertEquals(CAMERA1_WHERE_ID, camera.getWhereId());
+ assertTrue(camera.isAudioInputEnabled());
+ assertFalse(camera.isPublicShareEnabled());
+ assertFalse(camera.isStreaming());
+ assertFalse(camera.isVideoHistoryEnabled());
+ assertEquals("https://camera_app_url", camera.getAppUrl());
+ assertEquals(CAMERA1_DEVICE_ID, camera.getDeviceId());
+ assertNull(camera.getLastConnection());
+ assertEqualDateTime("2017-01-22T08:19:20.000Z", camera.getLastIsOnlineChange());
+ assertNull(camera.getPublicShareUrl());
+ assertEquals("https://camera_snapshot_url", camera.getSnapshotUrl());
+ assertEquals("205-600052", camera.getSoftwareVersion());
+ assertEquals("https://camera_web_url", camera.getWebUrl());
+ assertEquals("https://last_event_animated_image_url", camera.getLastEvent().getAnimatedImageUrl());
+ assertEquals(2, camera.getLastEvent().getActivityZones().size());
+ assertEquals("id1", camera.getLastEvent().getActivityZones().get(0));
+ assertEquals("https://last_event_app_url", camera.getLastEvent().getAppUrl());
+ assertEqualDateTime("2017-01-22T07:40:38.680Z", camera.getLastEvent().getEndTime());
+ assertEquals("https://last_event_image_url", camera.getLastEvent().getImageUrl());
+ assertEqualDateTime("2017-01-22T07:40:19.020Z", camera.getLastEvent().getStartTime());
+ assertEqualDateTime("2017-02-05T07:40:19.020Z", camera.getLastEvent().getUrlsExpireTime());
+ assertEquals("https://last_event_web_url", camera.getLastEvent().getWebUrl());
+ assertTrue(camera.getLastEvent().isHasMotion());
+ assertFalse(camera.getLastEvent().isHasPerson());
+ assertFalse(camera.getLastEvent().isHasSound());
+ }
+
+ @Test
+ public void verifySmokeDetector() throws IOException {
+ SmokeDetector smokeDetector = fromJson("smoke-detector-data.json", SmokeDetector.class);
+ logger.debug("SmokeDetector: {}", smokeDetector);
+
+ assertTrue(smokeDetector.isOnline());
+ assertEquals(SMOKE1_WHERE_ID, smokeDetector.getWhereId());
+ assertEquals(SMOKE1_DEVICE_ID, smokeDetector.getDeviceId());
+ assertEquals("Downstairs", smokeDetector.getName());
+ assertEquals("Downstairs Nest Protect", smokeDetector.getNameLong());
+ assertEqualDateTime("2017-02-02T20:53:05.338Z", smokeDetector.getLastConnection());
+ assertEquals(SmokeDetector.BatteryHealth.OK, smokeDetector.getBatteryHealth());
+ assertEquals(SmokeDetector.AlarmState.OK, smokeDetector.getCoAlarmState());
+ assertEquals(SmokeDetector.AlarmState.OK, smokeDetector.getSmokeAlarmState());
+ assertEquals("3.1rc9", smokeDetector.getSoftwareVersion());
+ assertEquals(STRUCTURE1_STRUCTURE_ID, smokeDetector.getStructureId());
+ assertEquals(SmokeDetector.UiColorState.GREEN, smokeDetector.getUiColorState());
+ }
+
+ @Test
+ public void verifyAccessToken() throws IOException {
+ AccessTokenData accessToken = fromJson("access-token-data.json", AccessTokenData.class);
+ logger.debug("AccessTokenData: {}", accessToken);
+
+ assertEquals("access_token", accessToken.getAccessToken());
+ assertEquals(Long.valueOf(315360000L), accessToken.getExpiresIn());
+ }
+
+ @Test
+ public void verifyStructure() throws IOException {
+ Structure structure = fromJson("structure-data.json", Structure.class);
+ logger.debug("Structure: {}", structure);
+
+ assertEquals("Home", structure.getName());
+ assertEquals("US", structure.getCountryCode());
+ assertEquals("98056", structure.getPostalCode());
+ assertEquals(Structure.HomeAwayState.HOME, structure.getAway());
+ assertEqualDateTime("2017-02-02T03:10:08.000Z", structure.getEtaBegin());
+ assertNull(structure.getEta());
+ assertNull(structure.getPeakPeriodEndTime());
+ assertNull(structure.getPeakPeriodStartTime());
+ assertEquals(STRUCTURE1_STRUCTURE_ID, structure.getStructureId());
+ assertEquals("America/Los_Angeles", structure.getTimeZone());
+ assertFalse(structure.isRhrEnrollment());
+ }
+
+ @Test
+ public void verifyError() throws IOException {
+ ErrorData error = fromJson("error-data.json", ErrorData.class);
+ logger.debug("ErrorData: {}", error);
+
+ assertEquals("blocked", error.getError());
+ assertEquals("https://developer.nest.com/documentation/cloud/error-messages#blocked", error.getType());
+ assertEquals("blocked", error.getMessage());
+ assertEquals("bb514046-edc9-4bca-8239-f7a3cfb0925a", error.getInstance());
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.internal.data;
+
+import java.io.BufferedReader;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.Reader;
+import java.io.UnsupportedEncodingException;
+import java.util.stream.Collectors;
+
+import javax.measure.Unit;
+import javax.measure.quantity.Temperature;
+
+import org.openhab.binding.nest.internal.NestUtils;
+import org.openhab.core.library.unit.ImperialUnits;
+import org.openhab.core.library.unit.SIUnits;
+
+/**
+ * Utility class for working with Nest test data in unit tests.
+ *
+ * @author Wouter Born - Increase test coverage
+ */
+public final class NestDataUtil {
+
+ public static final String COMPLETE_DATA_FILE_NAME = "top-level-streaming-data.json";
+ public static final String INCOMPLETE_DATA_FILE_NAME = "top-level-streaming-data-incomplete.json";
+ public static final String EMPTY_DATA_FILE_NAME = "top-level-streaming-data-empty.json";
+
+ public static final String CAMERA1_DEVICE_ID = "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ";
+ public static final String CAMERA1_WHERE_ID = "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA";
+
+ public static final String CAMERA2_DEVICE_ID = "VG7C7BU6Zf8OjEfizmBCVnwnuKHSnOBIHgbQKa57xKJzrvokK_DzFQ";
+ public static final String CAMERA2_WHERE_ID = "qpWvTu89Knhn6GRFM-VtGoE4KYwbzbJg9INR6WyPfhW1EJ04GRyYbQ";
+
+ public static final String SMOKE1_DEVICE_ID = "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV";
+ public static final String SMOKE1_WHERE_ID = "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIm5E0NfJPeeg";
+
+ public static final String SMOKE2_DEVICE_ID = "p1b1oySOcs8W9WwaNu80oXOu-iQr8PMV";
+ public static final String SMOKE2_WHERE_ID = "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA";
+
+ public static final String SMOKE3_DEVICE_ID = "p1b1oySOcs-OJHIgmgeMkHOu-iQr8PMV";
+ public static final String SMOKE3_WHERE_ID = "6UAWzz8czKpFrH6EK3AcjDiTjbRgts8x5MJxEnn1yKKQpYTBO7n2UQ";
+
+ public static final String SMOKE4_DEVICE_ID = "p1b1oySOcs8Qu7IAJVrQ7XOu-iQr8PMV";
+ public static final String SMOKE4_WHERE_ID = "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw";
+
+ public static final String STRUCTURE1_STRUCTURE_ID = "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A";
+
+ public static final String THERMOSTAT1_DEVICE_ID = "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV";
+ public static final String THERMOSTAT1_WHERE_ID = "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw";
+
+ private NestDataUtil() {
+ // Hidden utility class constructor
+ }
+
+ public static Reader openDataReader(String fileName) throws UnsupportedEncodingException {
+        String packagePath = NestDataUtil.class.getPackage().getName().replace(".", "/");
+ String filePath = "/" + packagePath + "/" + fileName;
+ InputStream inputStream = NestDataUtil.class.getClassLoader().getResourceAsStream(filePath);
+ return new InputStreamReader(inputStream, "UTF-8");
+ }
+
+ public static <T> T fromJson(String fileName, Class<T> dataClass) throws IOException {
+ try (Reader reader = openDataReader(fileName)) {
+ return NestUtils.fromJson(reader, dataClass);
+ }
+ }
+
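+    /**
+     * Reads the given resource and rewrites its {@code temperature_scale} field to match the requested
+     * unit, so the same test data can drive both the Celsius and the Fahrenheit test cases.
+     */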
+ public static String fromFile(String fileName, Unit<Temperature> temperatureUnit) throws IOException {
+ String json = fromFile(fileName);
+ if (temperatureUnit == SIUnits.CELSIUS) {
+ json = json.replace("\"temperature_scale\": \"F\"", "\"temperature_scale\": \"C\"");
+ } else if (temperatureUnit == ImperialUnits.FAHRENHEIT) {
+ json = json.replace("\"temperature_scale\": \"C\"", "\"temperature_scale\": \"F\"");
+ }
+ return json;
+ }
+
+ public static String fromFile(String fileName) throws IOException {
+ try (Reader reader = openDataReader(fileName)) {
+            return new BufferedReader(reader).lines().collect(Collectors.joining("\n"));
+ }
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.test;
+
+import static org.openhab.binding.nest.internal.NestBindingConstants.*;
+import static org.openhab.binding.nest.internal.rest.NestStreamingRestClient.*;
+
+import java.io.IOException;
+import java.io.InputStreamReader;
+import java.io.PrintWriter;
+import java.nio.charset.StandardCharsets;
+import java.util.HashMap;
+import java.util.Map;
+import java.util.Queue;
+import java.util.Set;
+import java.util.concurrent.ArrayBlockingQueue;
+import java.util.concurrent.ConcurrentHashMap;
+import java.util.concurrent.TimeUnit;
+
+import javax.servlet.ServletException;
+import javax.servlet.http.HttpServlet;
+import javax.servlet.http.HttpServletRequest;
+import javax.servlet.http.HttpServletResponse;
+
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import com.google.gson.Gson;
+import com.google.gson.GsonBuilder;
+import com.google.gson.reflect.TypeToken;
+
+/**
+ * The {@link NestTestApiServlet} mocks the Nest API during tests.
+ *
+ * @author Wouter Born - Increase test coverage
+ */
+public class NestTestApiServlet extends HttpServlet {
+
+ private static final long serialVersionUID = -5414910055159062745L;
+
+ private static final String NEW_LINE = "\n";
+
+    private static final String[] UPDATE_PATHS = { NEST_CAMERA_UPDATE_PATH, NEST_SMOKE_ALARM_UPDATE_PATH,
+            NEST_STRUCTURE_UPDATE_PATH, NEST_THERMOSTAT_UPDATE_PATH };
+
+ private final Logger logger = LoggerFactory.getLogger(NestTestApiServlet.class);
+
+ private class SseEvent {
+ private String name;
+ private String data;
+
+ public SseEvent(String name) {
+ this.name = name;
+ }
+
+ public SseEvent(String name, String data) {
+ this.name = name;
+ this.data = data;
+ }
+
+ public String getData() {
+ return data;
+ }
+
+ public String getName() {
+ return name;
+ }
+
+ public boolean hasData() {
+ return data != null && !data.isEmpty();
+ }
+ }
+
+ private final Map<String, Map<String, String>> nestIdPropertiesMap = new ConcurrentHashMap<>();
+
+ private final Map<Thread, Queue<SseEvent>> listenerQueues = new ConcurrentHashMap<>();
+
+ private final ThreadLocal<PrintWriter> threadLocalWriter = new ThreadLocal<>();
+
+ private final Gson gson = new GsonBuilder().create();
+
+ public void closeConnections() {
+ Set<Thread> threads = listenerQueues.keySet();
+ listenerQueues.clear();
+        threads.forEach(Thread::interrupt);
+ }
+
+ public void reset() {
+ nestIdPropertiesMap.clear();
+ }
+
+ public void queueEvent(String eventName) {
+ SseEvent event = new SseEvent(eventName);
+ listenerQueues.forEach((thread, queue) -> queue.add(event));
+ }
+
+ public void queueEvent(String eventName, String data) {
+ SseEvent event = new SseEvent(eventName, data);
+ listenerQueues.forEach((thread, queue) -> queue.add(event));
+ }
+
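+    /**
+     * Writes an event to the stream using the SSE wire format, e.g.:
+     *
+     * <pre>
+     * event: put
+     * data: {"path":"/","data":{...}}
+     * </pre>
+     */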
+ @SuppressWarnings("resource")
+ private void writeEvent(SseEvent event) {
+ logger.debug("Writing {} event", event.getName());
+
+ PrintWriter writer = threadLocalWriter.get();
+
+ writer.write("event: ");
+ writer.write(event.getName());
+ writer.write(NEW_LINE);
+
+ if (event.hasData()) {
+ for (String dataLine : event.getData().split(NEW_LINE)) {
+ writer.write("data: ");
+ writer.write(dataLine);
+ writer.write(NEW_LINE);
+ }
+ }
+
+ writer.write(NEW_LINE);
+ writer.flush();
+ }
+
+ private void writeEvent(String eventName) {
+ writeEvent(new SseEvent(eventName));
+ }
+
+ @Override
+ public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
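+        // Register a per-connection event queue for this request thread; events queued via queueEvent()
+        // are fanned out to all connected listeners, and keep-alive events are written while the queue is empty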
+ ArrayBlockingQueue<SseEvent> queue = new ArrayBlockingQueue<>(10);
+ listenerQueues.put(Thread.currentThread(), queue);
+
+ response.setContentType("text/event-stream");
+ response.setCharacterEncoding("UTF-8");
+ response.flushBuffer();
+
+ logger.debug("Opened event stream to {}:{}", request.getRemoteHost(), request.getRemotePort());
+
+ PrintWriter writer = response.getWriter();
+ threadLocalWriter.set(writer);
+ writeEvent(OPEN);
+
+ while (listenerQueues.containsKey(Thread.currentThread()) && !writer.checkError()) {
+ try {
+ SseEvent event = queue.poll(KEEP_ALIVE_MILLIS, TimeUnit.MILLISECONDS);
+ if (event != null) {
+ writeEvent(event);
+ } else {
+ writeEvent(KEEP_ALIVE);
+ }
+ } catch (InterruptedException e) {
+ logger.debug("Evaluating loop conditions after interrupt");
+ }
+ }
+
+ listenerQueues.remove(Thread.currentThread());
+ threadLocalWriter.remove();
+ writer.close();
+
+ logger.debug("Closed event stream to {}:{}", request.getRemoteHost(), request.getRemotePort());
+ }
+
+ @Override
+ protected void doPut(HttpServletRequest request, HttpServletResponse response)
+ throws ServletException, IOException {
+ logger.debug("Received put request: {}", request);
+
+ String uri = request.getRequestURI();
+ String nestId = getNestIdFromURI(uri);
+
+ if (nestId == null) {
+ logger.error("Unsupported URI: {}", uri);
+ response.setStatus(HttpServletResponse.SC_INTERNAL_SERVER_ERROR);
+ return;
+ }
+
+        InputStreamReader reader = new InputStreamReader(request.getInputStream(), StandardCharsets.UTF_8);
+ Map<String, String> propertiesUpdate = gson.fromJson(reader, new TypeToken<Map<String, String>>() {
+ }.getType());
+
+ Map<String, String> properties = getOrCreateProperties(nestId);
+ properties.putAll(propertiesUpdate);
+
+ gson.toJson(propertiesUpdate, response.getWriter());
+
+ response.setStatus(HttpServletResponse.SC_OK);
+ }
+
+    private String getNestIdFromURI(String uri) {
+        for (String updatePath : UPDATE_PATHS) {
+            if (uri.startsWith(updatePath)) {
+                // Strip the update path prefix literally; replaceAll would interpret it as a regex
+                return uri.substring(updatePath.length());
+            }
+        }
+        return null;
+    }
+
+    private Map<String, String> getOrCreateProperties(String nestId) {
+        // computeIfAbsent is atomic on a ConcurrentHashMap, so concurrent updates cannot lose properties
+        return nestIdPropertiesMap.computeIfAbsent(nestId, id -> new HashMap<>());
+    }
+
+ public String getNestIdPropertyState(String nestId, String propertyName) {
+ Map<String, String> properties = nestIdPropertiesMap.get(nestId);
+ return properties == null ? null : properties.get(propertyName);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.test;
+
+import static org.openhab.binding.nest.internal.NestBindingConstants.BINDING_ID;
+
+import java.util.Collections;
+import java.util.Properties;
+import java.util.Set;
+
+import javax.ws.rs.client.ClientBuilder;
+
+import org.openhab.binding.nest.internal.exceptions.InvalidAccessTokenException;
+import org.openhab.binding.nest.internal.handler.NestBridgeHandler;
+import org.openhab.binding.nest.internal.handler.NestRedirectUrlSupplier;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.ThingTypeUID;
+import org.osgi.service.jaxrs.client.SseEventSourceFactory;
+
+/**
+ * The {@link NestTestBridgeHandler} is a {@link NestBridgeHandler} modified for testing. Using the
+ * {@link NestTestRedirectUrlSupplier} it always connects to the provided {@link #redirectUrl}.
+ *
+ * @author Wouter Born - Increase test coverage
+ */
+public class NestTestBridgeHandler extends NestBridgeHandler {
+
+ class NestTestRedirectUrlSupplier extends NestRedirectUrlSupplier {
+
+ NestTestRedirectUrlSupplier(Properties httpHeaders) {
+ super(httpHeaders);
+ this.cachedUrl = redirectUrl;
+ }
+
+ @Override
+ public void resetCache() {
+ // Skip resetting the URL so the test server keeps being used
+ }
+ }
+
+ public static final ThingTypeUID THING_TYPE_TEST_BRIDGE = new ThingTypeUID(BINDING_ID, "test_account");
+ public static final Set<ThingTypeUID> SUPPORTED_THING_TYPES = Collections.singleton(THING_TYPE_TEST_BRIDGE);
+
+ private String redirectUrl;
+
+ public NestTestBridgeHandler(Bridge bridge, ClientBuilder clientBuilder, SseEventSourceFactory eventSourceFactory,
+ String redirectUrl) {
+ super(bridge, clientBuilder, eventSourceFactory);
+ this.redirectUrl = redirectUrl;
+ }
+
+ @Override
+ protected NestRedirectUrlSupplier createRedirectUrlSupplier() throws InvalidAccessTokenException {
+ return new NestTestRedirectUrlSupplier(getHttpHeaders());
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.test;
+
+import java.util.HashMap;
+import java.util.Hashtable;
+import java.util.Map;
+
+import javax.ws.rs.client.ClientBuilder;
+
+import org.eclipse.jdt.annotation.NonNullByDefault;
+import org.eclipse.jdt.annotation.Nullable;
+import org.openhab.binding.nest.internal.discovery.NestDiscoveryService;
+import org.openhab.binding.nest.internal.handler.NestBridgeHandler;
+import org.openhab.core.config.discovery.DiscoveryService;
+import org.openhab.core.thing.Bridge;
+import org.openhab.core.thing.Thing;
+import org.openhab.core.thing.ThingTypeUID;
+import org.openhab.core.thing.ThingUID;
+import org.openhab.core.thing.binding.BaseThingHandlerFactory;
+import org.openhab.core.thing.binding.ThingHandler;
+import org.openhab.core.thing.binding.ThingHandlerFactory;
+import org.osgi.framework.ServiceRegistration;
+import org.osgi.service.component.ComponentContext;
+import org.osgi.service.component.annotations.Activate;
+import org.osgi.service.component.annotations.Modified;
+import org.osgi.service.component.annotations.Reference;
+import org.osgi.service.jaxrs.client.SseEventSourceFactory;
+
+/**
+ * The {@link NestTestHandlerFactory} is responsible for creating test things and thing handlers.
+ *
+ * @author Wouter Born - Increase test coverage
+ */
+@NonNullByDefault
+public class NestTestHandlerFactory extends BaseThingHandlerFactory implements ThingHandlerFactory {
+
+ public static final String REDIRECT_URL_CONFIG_PROPERTY = "redirect.url";
+
+ private final ClientBuilder clientBuilder;
+ private final SseEventSourceFactory eventSourceFactory;
+ private final Map<ThingUID, ServiceRegistration<?>> discoveryService = new HashMap<>();
+
+ private String redirectUrl = "http://localhost";
+
+ @Activate
+ public NestTestHandlerFactory(@Reference ClientBuilder clientBuilder,
+ @Reference SseEventSourceFactory eventSourceFactory) {
+ this.clientBuilder = clientBuilder;
+ this.eventSourceFactory = eventSourceFactory;
+ }
+
+ @Override
+ public boolean supportsThingType(ThingTypeUID thingTypeUID) {
+ return NestTestBridgeHandler.SUPPORTED_THING_TYPES.contains(thingTypeUID);
+ }
+
+ @Activate
+ public void activate(ComponentContext componentContext, Map<String, Object> config) {
+ super.activate(componentContext);
+ modified(config);
+ }
+
+ @Modified
+ public void modified(Map<String, Object> config) {
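+ // Apply a redirect URL override from the component configuration, keeping the default otherwise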
+ String url = (String) config.get(REDIRECT_URL_CONFIG_PROPERTY);
+ if (url != null) {
+ this.redirectUrl = url;
+ }
+ }
+
+ @Override
+ protected @Nullable ThingHandler createHandler(Thing thing) {
+ ThingTypeUID thingTypeUID = thing.getThingTypeUID();
+ if (thingTypeUID.equals(NestTestBridgeHandler.THING_TYPE_TEST_BRIDGE)) {
+ NestTestBridgeHandler handler = new NestTestBridgeHandler((Bridge) thing, clientBuilder, eventSourceFactory,
+ redirectUrl);
+ NestDiscoveryService service = new NestDiscoveryService(handler);
+ // Register the discovery service.
+ discoveryService.put(handler.getThing().getUID(),
+ bundleContext.registerService(DiscoveryService.class.getName(), service, new Hashtable<>()));
+
+ return handler;
+ }
+ return null;
+ }
+
+ /**
+ * Removes the handler for the specific thing. This also handles disabling the discovery
+ * service when the bridge is removed.
+ */
+ @Override
+ protected void removeHandler(ThingHandler thingHandler) {
+ if (thingHandler instanceof NestBridgeHandler) {
+ ServiceRegistration<?> registration = discoveryService.get(thingHandler.getThing().getUID());
+ if (registration != null) {
+ // Unregister the discovery service.
+ NestDiscoveryService service = (NestDiscoveryService) bundleContext
+ .getService(registration.getReference());
+ service.deactivate();
+ registration.unregister();
+ discoveryService.remove(thingHandler.getThing().getUID());
+ }
+ }
+ super.removeHandler(thingHandler);
+ }
+}
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.binding.nest.test;
+
+import org.eclipse.jetty.server.Server;
+import org.eclipse.jetty.server.ServerConnector;
+import org.eclipse.jetty.servlet.ServletHandler;
+import org.eclipse.jetty.servlet.ServletHolder;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+/**
+ * Embedded Jetty server used in the tests.
+ *
+ * Based on {@code TestServer} of the FS Internet Radio Binding.
+ *
+ * @author Velin Yordanov - Initial contribution
+ * @author Wouter Born - Increase test coverage
+ */
+public class NestTestServer {
+ private final Logger logger = LoggerFactory.getLogger(NestTestServer.class);
+
+ private Server server;
+ private String host;
+ private int port;
+ private int timeout;
+ private ServletHolder servletHolder;
+
+ public NestTestServer(String host, int port, int timeout, ServletHolder servletHolder) {
+ this.host = host;
+ this.port = port;
+ this.timeout = timeout;
+ this.servletHolder = servletHolder;
+ }
+
+ public void startServer() {
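+ // Run the server on a separate thread because server.join() blocks until the server is stopped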
+ Thread thread = new Thread(new Runnable() {
+ @Override
+ @SuppressWarnings("resource")
+ public void run() {
+ server = new Server();
+ ServletHandler handler = new ServletHandler();
+ handler.addServletWithMapping(servletHolder, "/*");
+ server.setHandler(handler);
+
+ // HTTP connector
+ ServerConnector http = new ServerConnector(server);
+ http.setHost(host);
+ http.setPort(port);
+ http.setIdleTimeout(timeout);
+
+ server.addConnector(http);
+
+ try {
+ server.start();
+ server.join();
+ } catch (InterruptedException ex) {
+ logger.error("Server got interrupted", ex);
+ // Restore the interrupt status so callers can still observe the interruption
+ Thread.currentThread().interrupt();
+ } catch (Exception e) {
+ logger.error("Error while starting the server", e);
+ }
+ }
+ });
+
+ thread.start();
+ }
+
+ public void stopServer() {
+ if (server == null) {
+ // The start thread has not created the server yet
+ return;
+ }
+ try {
+ server.stop();
+ } catch (Exception e) {
+ logger.error("Error while stopping the server", e);
+ }
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<thing:thing-descriptions bindingId="nest"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xmlns:thing="https://openhab.org/schemas/thing-description/v1.0.0"
+ xsi:schemaLocation="https://openhab.org/schemas/thing-description/v1.0.0 https://openhab.org/schemas/thing-description-1.0.0.xsd">
+
+ <bridge-type id="test_account">
+ <label>Test Account</label>
+ <description>An account for testing the Nest binding</description>
+ <config-description-ref uri="thing-type:nest:account"/>
+ </bridge-type>
+</thing:thing-descriptions>
--- /dev/null
+{
+ "access_token": "access_token",
+ "expires_in": 315360000
+}
--- /dev/null
+{
+ "app_url": "https://camera_app_url",
+ "device_id": "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ",
+ "is_audio_input_enabled": true,
+ "is_online": true,
+ "is_public_share_enabled": false,
+ "is_streaming": false,
+ "is_video_history_enabled": false,
+ "last_event": {
+ "activity_zone_ids": [
+ "id1",
+ "id2"
+ ],
+ "animated_image_url": "https://last_event_animated_image_url",
+ "app_url": "https://last_event_app_url",
+ "end_time": "2017-01-22T07:40:38.680Z",
+ "has_motion": true,
+ "has_person": false,
+ "has_sound": false,
+ "image_url": "https://last_event_image_url",
+ "start_time": "2017-01-22T07:40:19.020Z",
+ "urls_expire_time": "2017-02-05T07:40:19.020Z",
+ "web_url": "https://last_event_web_url"
+ },
+ "last_is_online_change": "2017-01-22T08:19:20.000Z",
+ "name": "Upstairs",
+ "name_long": "Upstairs Camera",
+ "snapshot_url": "https://camera_snapshot_url",
+ "software_version": "205-600052",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "web_url": "https://camera_web_url",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA"
+}
--- /dev/null
+{
+ "error": "blocked",
+ "type": "https://developer.nest.com/documentation/cloud/error-messages#blocked",
+ "message": "blocked",
+ "instance": "bb514046-edc9-4bca-8239-f7a3cfb0925a"
+}
--- /dev/null
+{
+ "battery_health": "ok",
+ "co_alarm_state": "ok",
+ "device_id": "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV",
+ "is_manual_test_active": false,
+ "is_online": true,
+ "last_connection": "2017-02-02T20:53:05.338Z",
+ "locale": "en-US",
+ "name": "Downstairs",
+ "name_long": "Downstairs Nest Protect",
+ "smoke_alarm_state": "ok",
+ "software_version": "3.1rc9",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "ui_color_state": "green",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIm5E0NfJPeeg",
+ "where_name": "Downstairs"
+}
--- /dev/null
+{
+ "smoke_co_alarms": [
+ "p1b1oySOcs-OJHIgmgeMkHOu-iQr8PMV",
+ "p1b1oySOcs8Qu7IAJVrQ7XOu-iQr8PMV",
+ "p1b1oySOcs8W9WwaNu80oXOu-iQr8PMV",
+ "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV"
+ ],
+ "name": "Home",
+ "country_code": "US",
+ "postal_code": "98056",
+ "time_zone": "America/Los_Angeles",
+ "away": "home",
+ "thermostats": [
+ "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV"
+ ],
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "rhr_enrollment": false,
+ "co_alarm_state": "ok",
+ "smoke_alarm_state": "ok",
+ "eta_begin": "2017-02-02T03:10:08.000Z",
+ "wwn_security_state": "ok",
+ "wheres": {
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIYpqdaXnYjUg": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIYpqdaXnYjUg",
+ "name": "Basement"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK-nCnEjccnMQ": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK-nCnEjccnMQ",
+ "name": "Bedroom"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJyRQEOtmKqkw": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJyRQEOtmKqkw",
+ "name": "Den"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKZphUIYeW39g": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKZphUIYeW39g",
+ "name": "Dining Room"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIm5E0NfJPeeg": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIm5E0NfJPeeg",
+ "name": "Downstairs"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK2kdsXRP3IFg": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK2kdsXRP3IFg",
+ "name": "Entryway"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIAYVvcpN1cOA": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIAYVvcpN1cOA",
+ "name": "Family Room"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB7GULj0y7Rw": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB7GULj0y7Rw",
+ "name": "Hallway"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIbTUmML4Q6xA": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIbTUmML4Q6xA",
+ "name": "Kids Room"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB2f05cPKRBA": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB2f05cPKRBA",
+ "name": "Kitchen"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw",
+ "name": "Living Room"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIebdVzhA62Iw": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIebdVzhA62Iw",
+ "name": "Master Bedroom"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKtUyRb3je64Q": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKtUyRb3je64Q",
+ "name": "Office"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA",
+ "name": "Upstairs"
+ },
+ "6UAWzz8czKpFrH6EK3AcjDiTjbRgts8x5MJxEnn1yKKQpYTBO7n2UQ": {
+ "where_id": "6UAWzz8czKpFrH6EK3AcjDiTjbRgts8x5MJxEnn1yKKQpYTBO7n2UQ",
+ "name": "Downstairs Kitchen"
+ },
+ "qpWvTu89Knhn6GRFM-VtGoE4KYwbzbJg9INR6WyPfhW1EJ04GRyYbQ": {
+ "where_id": "qpWvTu89Knhn6GRFM-VtGoE4KYwbzbJg9INR6WyPfhW1EJ04GRyYbQ",
+ "name": "Garage"
+ },
+ "8tH6YiXUAQDZFLD6AgMmQ14Sc5wTG0NxKfabPY0XKrqc47t3uSDZvQ": {
+ "where_id": "8tH6YiXUAQDZFLD6AgMmQ14Sc5wTG0NxKfabPY0XKrqc47t3uSDZvQ",
+ "name": "Frog"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKfexoqPTcUVA": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKfexoqPTcUVA",
+ "name": "Backyard"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJv12iEHQ0hxA": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJv12iEHQ0hxA",
+ "name": "Driveway"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsLRu9lIioI47g": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsLRu9lIioI47g",
+ "name": "Front Yard"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKR8TWb9hTptQ": {
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKR8TWb9hTptQ",
+ "name": "Outside"
+ }
+ },
+ "cameras": [
+ "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ",
+ "VG7C7BU6Zf8OjEfizmBCVnwnuKHSnOBIHgbQKa57xKJzrvokK_DzFQ"
+ ]
+}
--- /dev/null
+{
+ "ambient_temperature_c": 19.0,
+ "ambient_temperature_f": 66,
+ "away_temperature_high_c": 24.0,
+ "away_temperature_high_f": 76,
+ "away_temperature_low_c": 12.5,
+ "away_temperature_low_f": 55,
+ "can_cool": false,
+ "can_heat": true,
+ "device_id": "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV",
+ "eco_temperature_high_c": 24.0,
+ "eco_temperature_high_f": 76,
+ "eco_temperature_low_c": 12.5,
+ "eco_temperature_low_f": 55,
+ "fan_timer_active": false,
+ "fan_timer_duration": 15,
+ "fan_timer_timeout": "1970-01-01T00:00:00.000Z",
+ "has_fan": true,
+ "has_leaf": true,
+ "humidity": 25,
+ "hvac_mode": "heat",
+ "hvac_state": "off",
+ "is_locked": false,
+ "is_online": true,
+ "is_using_emergency_heat": false,
+ "label": "Living Room",
+ "last_connection": "2017-02-02T21:00:06.000Z",
+ "locale": "en-GB",
+ "locked_temp_max_c": 22.0,
+ "locked_temp_max_f": 72,
+ "locked_temp_min_c": 20.0,
+ "locked_temp_min_f": 68,
+ "name": "Living Room (Living Room)",
+ "name_long": "Living Room Thermostat (Living Room)",
+ "previous_hvac_mode": "",
+ "software_version": "5.6-7",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "sunlight_correction_active": false,
+ "sunlight_correction_enabled": true,
+ "target_temperature_c": 15.5,
+ "target_temperature_f": 60,
+ "target_temperature_high_c": 24.0,
+ "target_temperature_high_f": 75,
+ "target_temperature_low_c": 20.0,
+ "target_temperature_low_f": 68,
+ "temperature_scale": "C",
+ "time_to_target": "~0",
+ "time_to_target_training": "ready",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw",
+ "where_name": "Living Room"
+}
--- /dev/null
+{
+ "devices": {
+ "cameras": {
+ "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ": {
+ "app_url": "https://camera_app_url",
+ "device_id": "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ",
+ "is_audio_input_enabled": true,
+ "is_online": false,
+ "is_public_share_enabled": false,
+ "is_streaming": false,
+ "is_video_history_enabled": false,
+ "last_event": {
+ "activity_zone_ids": [
+ "id1",
+ "id2"
+ ],
+ "animated_image_url": "https://last_event_animated_image_url",
+ "app_url": "https://last_event_app_url",
+ "end_time": "2017-01-22T07:40:38.680Z",
+ "has_motion": true,
+ "has_person": false,
+ "has_sound": false,
+ "image_url": "https://last_event_image_url",
+ "start_time": "2017-01-22T07:40:19.020Z",
+ "urls_expire_time": "2017-02-05T07:40:19.020Z",
+ "web_url": "https://last_event_web_url"
+ },
+ "last_is_online_change": "2017-01-22T08:19:20.000Z",
+ "name": "Upstairs",
+ "name_long": "Upstairs Camera",
+ "snapshot_url": "https://camera_snapshot_url",
+ "software_version": "205-600052",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "web_url": "https://camera_web_url",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA"
+ },
+ "VG7C7BU6Zf8OjEfizmBCVnwnuKHSnOBIHgbQKa57xKJzrvokK_DzFQ": {
+ "app_url": "nestmobile://cameras/CjZWRzdDN0JVNlpmOE9qRWZpem1CQ1Zud251S0hTbk9CSUhnYlFLYTU3eEtKenJ2b2tLX0R6RlESFm9wNVB2NW93NmJ6cUdvMkZQSGUxdEEaNld0Mkl5b2tIR0tKX2FpUVd1SkRnQjc2ejhSWFl3SFFxWXFrSWx2QlpxN1gyeWNqdmRZVjdGQQ?auth=c.eQ5QBBPiFOTNzPHbmZPcE9yPZ7GayzLusifgQR2DQRFNyUS9ESvlhJF0D7vG8Y0TFV39zX1vIOsWrv8RKCMrFepNUb9FqHEboa4MtWLUsGb4tD9oBh0jrV4HooJUmz5sVA5KZR0dkxyLYyPc",
+ "device_id": "VG7C7BU6Zf8OjEfizmBCVnwnuKHSnOBIHgbQKa57xKJzrvokK_DzFQ",
+ "is_audio_input_enabled": true,
+ "is_online": false,
+ "is_public_share_enabled": false,
+ "is_streaming": false,
+ "is_video_history_enabled": false,
+ "last_event": {
+ "end_time": "2016-11-20T07:02:46.860Z",
+ "has_motion": true,
+ "has_person": false,
+ "has_sound": false,
+ "start_time": "2016-11-20T07:02:27.260Z"
+ },
+ "last_is_online_change": "2016-11-20T07:03:42.000Z",
+ "name": "Garage",
+ "name_long": "Garage Camera",
+ "snapshot_url": "https://www.dropcam.com/api/wwn.get_snapshot/CjZWRzdDN0JVNlpmOE9qRWZpem1CQ1Zud251S0hTbk9CSUhnYlFLYTU3eEtKenJ2b2tLX0R6RlESFm9wNVB2NW93NmJ6cUdvMkZQSGUxdEEaNld0Mkl5b2tIR0tKX2FpUVd1SkRnQjc2ejhSWFl3SFFxWXFrSWx2QlpxN1gyeWNqdmRZVjdGQQ?auth=c.eQ5QBBPiFOTNzPHbmZPcE9yPZ7GayzLusifgQR2DQRFNyUS9ESvlhJF0D7vG8Y0TFV39zX1vIOsWrv8RKCMrFepNUb9FqHEboa4MtWLUsGb4tD9oBh0jrV4HooJUmz5sVA5KZR0dkxyLYyPc",
+ "software_version": "205-600052",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "web_url": "https://home.nest.com/cameras/CjZWRzdDN0JVNlpmOE9qRWZpem1CQ1Zud251S0hTbk9CSUhnYlFLYTU3eEtKenJ2b2tLX0R6RlESFm9wNVB2NW93NmJ6cUdvMkZQSGUxdEEaNld0Mkl5b2tIR0tKX2FpUVd1SkRnQjc2ejhSWFl3SFFxWXFrSWx2QlpxN1gyeWNqdmRZVjdGQQ?auth=c.eQ5QBBPiFOTNzPHbmZPcE9yPZ7GayzLusifgQR2DQRFNyUS9ESvlhJF0D7vG8Y0TFV39zX1vIOsWrv8RKCMrFepNUb9FqHEboa4MtWLUsGb4tD9oBh0jrV4HooJUmz5sVA5KZR0dkxyLYyPc",
+ "where_id": "qpWvTu89Knhn6GRFM-VtGoE4KYwbzbJg9INR6WyPfhW1EJ04GRyYbQ"
+ }
+ },
+ "smoke_co_alarms": {
+ "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV": {
+ "battery_health": "ok",
+ "co_alarm_state": "ok",
+ "device_id": "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV",
+ "is_manual_test_active": false,
+ "is_online": true,
+ "last_connection": "2017-02-02T20:53:05.338Z",
+ "locale": "en-US",
+ "name": "Downstairs",
+ "name_long": "Downstairs Nest Protect",
+ "smoke_alarm_state": "ok",
+ "software_version": "3.1rc9",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "ui_color_state": "green",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIm5E0NfJPeeg",
+ "where_name": "Downstairs"
+ },
+ "p1b1oySOcs8W9WwaNu80oXOu-iQr8PMV": {
+ "battery_health": "ok",
+ "co_alarm_state": "ok",
+ "device_id": "p1b1oySOcs8W9WwaNu80oXOu-iQr8PMV",
+ "is_manual_test_active": false,
+ "is_online": true,
+ "last_connection": "2017-02-02T20:35:50.051Z",
+ "last_manual_test_time": "1970-01-01T00:00:00.000Z",
+ "locale": "en-US",
+ "name": "Upstairs",
+ "name_long": "Upstairs Nest Protect",
+ "smoke_alarm_state": "ok",
+ "software_version": "3.1rc9",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "ui_color_state": "green",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA",
+ "where_name": "Upstairs"
+ },
+ "p1b1oySOcs-OJHIgmgeMkHOu-iQr8PMV": {
+ "battery_health": "ok",
+ "co_alarm_state": "ok",
+ "device_id": "p1b1oySOcs-OJHIgmgeMkHOu-iQr8PMV",
+ "is_manual_test_active": false,
+ "is_online": true,
+ "last_connection": "2017-02-02T11:04:18.804Z",
+ "last_manual_test_time": "1970-01-01T00:00:00.000Z",
+ "locale": "en-US",
+ "name": "Downstairs Kitchen",
+ "name_long": "Downstairs Kitchen Nest Protect",
+ "smoke_alarm_state": "ok",
+ "software_version": "3.1rc9",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "ui_color_state": "green",
+ "where_id": "6UAWzz8czKpFrH6EK3AcjDiTjbRgts8x5MJxEnn1yKKQpYTBO7n2UQ",
+ "where_name": "Downstairs Kitchen"
+ },
+ "p1b1oySOcs8Qu7IAJVrQ7XOu-iQr8PMV": {
+ "battery_health": "ok",
+ "co_alarm_state": "ok",
+ "device_id": "p1b1oySOcs8Qu7IAJVrQ7XOu-iQr8PMV",
+ "is_manual_test_active": false,
+ "is_online": true,
+ "last_connection": "2017-02-02T13:30:34.187Z",
+ "last_manual_test_time": "1970-01-01T00:00:00.000Z",
+ "locale": "en-US",
+ "name": "Living Room",
+ "name_long": "Living Room Nest Protect",
+ "smoke_alarm_state": "ok",
+ "software_version": "3.1rc9",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "ui_color_state": "green",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw",
+ "where_name": "Living Room"
+ }
+ },
+ "thermostats": {
+ "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV": {
+ "ambient_temperature_c": 19.0,
+ "ambient_temperature_f": 66,
+ "away_temperature_high_c": 24.0,
+ "away_temperature_high_f": 76,
+ "away_temperature_low_c": 12.5,
+ "away_temperature_low_f": 55,
+ "can_cool": false,
+ "can_heat": true,
+ "device_id": "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV",
+ "eco_temperature_high_c": 24.0,
+ "eco_temperature_high_f": 76,
+ "eco_temperature_low_c": 12.5,
+ "eco_temperature_low_f": 55,
+ "fan_timer_active": false,
+ "fan_timer_duration": 15,
+ "fan_timer_timeout": "1970-01-01T00:00:00.000Z",
+ "has_fan": true,
+ "has_leaf": true,
+ "humidity": 25,
+ "hvac_mode": "heat",
+ "hvac_state": "off",
+ "is_locked": false,
+ "is_online": true,
+ "is_using_emergency_heat": false,
+ "label": "Living Room",
+ "last_connection": "2017-02-02T21:00:06.000Z",
+ "locale": "en-GB",
+ "locked_temp_max_c": 22.0,
+ "locked_temp_max_f": 72,
+ "locked_temp_min_c": 20.0,
+ "locked_temp_min_f": 68,
+ "name": "Living Room (Living Room)",
+ "name_long": "Living Room Thermostat (Living Room)",
+ "previous_hvac_mode": "",
+ "software_version": "5.6-7",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "sunlight_correction_active": false,
+ "sunlight_correction_enabled": true,
+ "target_temperature_c": 15.5,
+ "target_temperature_f": 60,
+ "target_temperature_high_c": 24.0,
+ "target_temperature_high_f": 75,
+ "target_temperature_low_c": 20.0,
+ "target_temperature_low_f": 68,
+ "temperature_scale": "C",
+ "time_to_target": "~0",
+ "time_to_target_training": "ready",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw",
+ "where_name": "Living Room"
+ }
+ }
+ },
+ "metadata": {
+ "access_token": "c.eQ5QBBPiFOTNzPHbmZPcE9yPZ7GayzLusifgQR2DQRFNyUS9ESvlhJF0D7vG8Y0TFV39zX1vIOsWrv8RKCMrFepNUb9FqHEboa4MtWLUsGb4tD9oBh0jrV4HooJUmz5sVA5KZR0dkxyLYyPc",
+ "client_version": 1
+ },
+ "structures": {
+ "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A": {
+ "away": "home",
+ "cameras": [
+ "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ",
+ "VG7C7BU6Zf8OjEfizmBCVnwnuKHSnOBIHgbQKa57xKJzrvokK_DzFQ"
+ ],
+ "co_alarm_state": "ok",
+ "country_code": "US",
+ "eta_begin": "2017-02-02T03:10:08.000Z",
+ "name": "Home",
+ "postal_code": "98056",
+ "rhr_enrollment": false,
+ "smoke_alarm_state": "ok",
+ "smoke_co_alarms": [
+ "p1b1oySOcs-OJHIgmgeMkHOu-iQr8PMV",
+ "p1b1oySOcs8Qu7IAJVrQ7XOu-iQr8PMV",
+ "p1b1oySOcs8W9WwaNu80oXOu-iQr8PMV",
+ "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV"
+ ],
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "thermostats": [
+ "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV"
+ ],
+ "time_zone": "America/Los_Angeles",
+ "wheres": {
+ "6UAWzz8czKpFrH6EK3AcjDiTjbRgts8x5MJxEnn1yKKQpYTBO7n2UQ": {
+ "name": "Downstairs Kitchen",
+ "where_id": "6UAWzz8czKpFrH6EK3AcjDiTjbRgts8x5MJxEnn1yKKQpYTBO7n2UQ"
+ },
+ "8tH6YiXUAQDZFLD6AgMmQ14Sc5wTG0NxKfabPY0XKrqc47t3uSDZvQ": {
+ "name": "Frog",
+ "where_id": "8tH6YiXUAQDZFLD6AgMmQ14Sc5wTG0NxKfabPY0XKrqc47t3uSDZvQ"
+ },
+ "qpWvTu89Knhn6GRFM-VtGoE4KYwbzbJg9INR6WyPfhW1EJ04GRyYbQ": {
+ "name": "Garage",
+ "where_id": "qpWvTu89Knhn6GRFM-VtGoE4KYwbzbJg9INR6WyPfhW1EJ04GRyYbQ"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIAYVvcpN1cOA": {
+ "name": "Family Room",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIAYVvcpN1cOA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB2f05cPKRBA": {
+ "name": "Kitchen",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB2f05cPKRBA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB7GULj0y7Rw": {
+ "name": "Hallway",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB7GULj0y7Rw"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIYpqdaXnYjUg": {
+ "name": "Basement",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIYpqdaXnYjUg"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIbTUmML4Q6xA": {
+ "name": "Kids Room",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIbTUmML4Q6xA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIebdVzhA62Iw": {
+ "name": "Master Bedroom",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIebdVzhA62Iw"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIm5E0NfJPeeg": {
+ "name": "Downstairs",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIm5E0NfJPeeg"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJv12iEHQ0hxA": {
+ "name": "Driveway",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJv12iEHQ0hxA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJyRQEOtmKqkw": {
+ "name": "Den",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJyRQEOtmKqkw"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK-nCnEjccnMQ": {
+ "name": "Bedroom",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK-nCnEjccnMQ"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK2kdsXRP3IFg": {
+ "name": "Entryway",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK2kdsXRP3IFg"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA": {
+ "name": "Upstairs",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw": {
+ "name": "Living Room",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKR8TWb9hTptQ": {
+ "name": "Outside",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKR8TWb9hTptQ"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKZphUIYeW39g": {
+ "name": "Dining Room",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKZphUIYeW39g"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKfexoqPTcUVA": {
+ "name": "Backyard",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKfexoqPTcUVA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKtUyRb3je64Q": {
+ "name": "Office",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKtUyRb3je64Q"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsLRu9lIioI47g": {
+ "name": "Front Yard",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsLRu9lIioI47g"
+ }
+ },
+ "wwn_security_state": "ok"
+ }
+ }
+}
--- /dev/null
+{
+ "path": "/",
+ "data": {
+ }
+}
--- /dev/null
+{
+ "path": "/",
+ "data": {
+ "devices": {
+ "cameras": {
+ "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ": {
+ "device_id": "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ"
+ },
+ "VG7C7BU6Zf8OjEfizmBCVnwnuKHSnOBIHgbQKa57xKJzrvokK_DzFQ": {
+ "device_id": "VG7C7BU6Zf8OjEfizmBCVnwnuKHSnOBIHgbQKa57xKJzrvokK_DzFQ"
+ }
+ },
+ "smoke_co_alarms": {
+ "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV": {
+ "device_id": "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV"
+ },
+ "p1b1oySOcs8W9WwaNu80oXOu-iQr8PMV": {
+ "device_id": "p1b1oySOcs8W9WwaNu80oXOu-iQr8PMV"
+ },
+ "p1b1oySOcs-OJHIgmgeMkHOu-iQr8PMV": {
+ "device_id": "p1b1oySOcs-OJHIgmgeMkHOu-iQr8PMV"
+ },
+ "p1b1oySOcs8Qu7IAJVrQ7XOu-iQr8PMV": {
+ "device_id": "p1b1oySOcs8Qu7IAJVrQ7XOu-iQr8PMV"
+ }
+ },
+ "thermostats": {
+ "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV": {
+ "device_id": "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV"
+ },
+ "OTQoylk2h5Ld3cfpm3esR0qx-iQr8PMV": {
+ "device_id": "OTQoylk2h5Ld3cfpm3esR0qx-iQr8PMV"
+ }
+ }
+ },
+ "structures": {
+ "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A": {
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A"
+ },
+ "SylKI7puaWd56ILAcJ46LzmtdZc3L4wGzScs8yLc5zccJofBIW9KTJ": {
+ "structure_id": "SylKI7puaWd56ILAcJ46LzmtdZc3L4wGzScs8yLc5zccJofBIW9KTJ"
+ }
+ }
+ }
+}
--- /dev/null
+{
+ "path": "/",
+ "data": {
+ "devices": {
+ "cameras": {
+ "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ": {
+ "app_url": "https://camera_app_url",
+ "device_id": "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ",
+ "is_audio_input_enabled": true,
+ "is_online": true,
+ "is_public_share_enabled": false,
+ "is_streaming": false,
+ "is_video_history_enabled": false,
+ "last_event": {
+ "activity_zone_ids": [
+ "id1",
+ "id2"
+ ],
+ "animated_image_url": "https://last_event_animated_image_url",
+ "app_url": "https://last_event_app_url",
+ "end_time": "2017-01-22T07:40:38.680Z",
+ "has_motion": true,
+ "has_person": false,
+ "has_sound": false,
+ "image_url": "https://last_event_image_url",
+ "start_time": "2017-01-22T07:40:19.020Z",
+ "urls_expire_time": "2017-02-05T07:40:19.020Z",
+ "web_url": "https://last_event_web_url"
+ },
+ "last_is_online_change": "2017-01-22T08:19:20.000Z",
+ "name": "Upstairs",
+ "name_long": "Upstairs Camera",
+ "public_share_url": "https://camera_public_share_url",
+ "snapshot_url": "https://camera_snapshot_url",
+ "software_version": "205-600052",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "web_url": "https://camera_web_url",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA"
+ },
+ "VG7C7BU6Zf8OjEfizmBCVnwnuKHSnOBIHgbQKa57xKJzrvokK_DzFQ": {
+ "app_url": "nestmobile://cameras/CjZWRzdDN0JVNlpmOE9qRWZpem1CQ1Zud251S0hTbk9CSUhnYlFLYTU3eEtKenJ2b2tLX0R6RlESFm9wNVB2NW93NmJ6cUdvMkZQSGUxdEEaNld0Mkl5b2tIR0tKX2FpUVd1SkRnQjc2ejhSWFl3SFFxWXFrSWx2QlpxN1gyeWNqdmRZVjdGQQ?auth=c.eQ5QBBPiFOTNzPHbmZPcE9yPZ7GayzLusifgQR2DQRFNyUS9ESvlhJF0D7vG8Y0TFV39zX1vIOsWrv8RKCMrFepNUb9FqHEboa4MtWLUsGb4tD9oBh0jrV4HooJUmz5sVA5KZR0dkxyLYyPc",
+ "device_id": "VG7C7BU6Zf8OjEfizmBCVnwnuKHSnOBIHgbQKa57xKJzrvokK_DzFQ",
+ "is_audio_input_enabled": true,
+ "is_online": false,
+ "is_public_share_enabled": false,
+ "is_streaming": false,
+ "is_video_history_enabled": false,
+ "last_event": {
+ "end_time": "2016-11-20T07:02:46.860Z",
+ "has_motion": true,
+ "has_person": false,
+ "has_sound": false,
+ "start_time": "2016-11-20T07:02:27.260Z"
+ },
+ "last_is_online_change": "2016-11-20T07:03:42.000Z",
+ "name": "Garage",
+ "name_long": "Garage Camera",
+ "snapshot_url": "https://www.dropcam.com/api/wwn.get_snapshot/CjZWRzdDN0JVNlpmOE9qRWZpem1CQ1Zud251S0hTbk9CSUhnYlFLYTU3eEtKenJ2b2tLX0R6RlESFm9wNVB2NW93NmJ6cUdvMkZQSGUxdEEaNld0Mkl5b2tIR0tKX2FpUVd1SkRnQjc2ejhSWFl3SFFxWXFrSWx2QlpxN1gyeWNqdmRZVjdGQQ?auth=c.eQ5QBBPiFOTNzPHbmZPcE9yPZ7GayzLusifgQR2DQRFNyUS9ESvlhJF0D7vG8Y0TFV39zX1vIOsWrv8RKCMrFepNUb9FqHEboa4MtWLUsGb4tD9oBh0jrV4HooJUmz5sVA5KZR0dkxyLYyPc",
+ "software_version": "205-600052",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "web_url": "https://home.nest.com/cameras/CjZWRzdDN0JVNlpmOE9qRWZpem1CQ1Zud251S0hTbk9CSUhnYlFLYTU3eEtKenJ2b2tLX0R6RlESFm9wNVB2NW93NmJ6cUdvMkZQSGUxdEEaNld0Mkl5b2tIR0tKX2FpUVd1SkRnQjc2ejhSWFl3SFFxWXFrSWx2QlpxN1gyeWNqdmRZVjdGQQ?auth=c.eQ5QBBPiFOTNzPHbmZPcE9yPZ7GayzLusifgQR2DQRFNyUS9ESvlhJF0D7vG8Y0TFV39zX1vIOsWrv8RKCMrFepNUb9FqHEboa4MtWLUsGb4tD9oBh0jrV4HooJUmz5sVA5KZR0dkxyLYyPc",
+ "where_id": "qpWvTu89Knhn6GRFM-VtGoE4KYwbzbJg9INR6WyPfhW1EJ04GRyYbQ"
+ }
+ },
+ "smoke_co_alarms": {
+ "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV": {
+ "battery_health": "ok",
+ "co_alarm_state": "ok",
+ "device_id": "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV",
+ "is_manual_test_active": false,
+ "is_online": true,
+ "last_connection": "2017-02-02T20:53:05.338Z",
+ "last_manual_test_time": "2016-10-31T23:59:59.000Z",
+ "locale": "en-US",
+ "name": "Downstairs",
+ "name_long": "Downstairs Nest Protect",
+ "smoke_alarm_state": "ok",
+ "software_version": "3.1rc9",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "ui_color_state": "green",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIm5E0NfJPeeg",
+ "where_name": "Downstairs"
+ },
+ "p1b1oySOcs8W9WwaNu80oXOu-iQr8PMV": {
+ "battery_health": "ok",
+ "co_alarm_state": "ok",
+ "device_id": "p1b1oySOcs8W9WwaNu80oXOu-iQr8PMV",
+ "is_manual_test_active": false,
+ "is_online": true,
+ "last_connection": "2017-02-02T20:35:50.051Z",
+ "last_manual_test_time": "1970-01-01T00:00:00.000Z",
+ "locale": "en-US",
+ "name": "Upstairs",
+ "name_long": "Upstairs Nest Protect",
+ "smoke_alarm_state": "ok",
+ "software_version": "3.1rc9",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "ui_color_state": "green",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA",
+ "where_name": "Upstairs"
+ },
+ "p1b1oySOcs-OJHIgmgeMkHOu-iQr8PMV": {
+ "battery_health": "ok",
+ "co_alarm_state": "ok",
+ "device_id": "p1b1oySOcs-OJHIgmgeMkHOu-iQr8PMV",
+ "is_manual_test_active": false,
+ "is_online": true,
+ "last_connection": "2017-02-02T11:04:18.804Z",
+ "last_manual_test_time": "1970-01-01T00:00:00.000Z",
+ "locale": "en-US",
+ "name": "Downstairs Kitchen",
+ "name_long": "Downstairs Kitchen Nest Protect",
+ "smoke_alarm_state": "ok",
+ "software_version": "3.1rc9",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "ui_color_state": "green",
+ "where_id": "6UAWzz8czKpFrH6EK3AcjDiTjbRgts8x5MJxEnn1yKKQpYTBO7n2UQ",
+ "where_name": "Downstairs Kitchen"
+ },
+ "p1b1oySOcs8Qu7IAJVrQ7XOu-iQr8PMV": {
+ "battery_health": "ok",
+ "co_alarm_state": "ok",
+ "device_id": "p1b1oySOcs8Qu7IAJVrQ7XOu-iQr8PMV",
+ "is_manual_test_active": false,
+ "is_online": true,
+ "last_connection": "2017-02-02T13:30:34.187Z",
+ "last_manual_test_time": "1970-01-01T00:00:00.000Z",
+ "locale": "en-US",
+ "name": "Living Room",
+ "name_long": "Living Room Nest Protect",
+ "smoke_alarm_state": "ok",
+ "software_version": "3.1rc9",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "ui_color_state": "green",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw",
+ "where_name": "Living Room"
+ }
+ },
+ "thermostats": {
+ "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV": {
+ "ambient_temperature_c": 19.0,
+ "ambient_temperature_f": 66,
+ "away_temperature_high_c": 24.0,
+ "away_temperature_high_f": 76,
+ "away_temperature_low_c": 12.5,
+ "away_temperature_low_f": 55,
+ "can_cool": false,
+ "can_heat": true,
+ "device_id": "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV",
+ "eco_temperature_high_c": 24.0,
+ "eco_temperature_high_f": 76,
+ "eco_temperature_low_c": 12.5,
+ "eco_temperature_low_f": 55,
+ "fan_timer_active": false,
+ "fan_timer_duration": 15,
+ "fan_timer_timeout": "1970-01-01T00:00:00.000Z",
+ "has_fan": true,
+ "has_leaf": true,
+ "humidity": 25,
+ "hvac_mode": "heat",
+ "hvac_state": "off",
+ "is_locked": false,
+ "is_online": true,
+ "is_using_emergency_heat": false,
+ "label": "Living Room",
+ "last_connection": "2017-02-02T21:00:06.000Z",
+ "locale": "en-GB",
+ "locked_temp_max_c": 22.0,
+ "locked_temp_max_f": 72,
+ "locked_temp_min_c": 20.0,
+ "locked_temp_min_f": 68,
+ "name": "Living Room (Living Room)",
+ "name_long": "Living Room Thermostat (Living Room)",
+ "previous_hvac_mode": "",
+ "software_version": "5.6-7",
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "sunlight_correction_active": false,
+ "sunlight_correction_enabled": true,
+ "target_temperature_c": 15.5,
+ "target_temperature_f": 60,
+ "target_temperature_high_c": 24.0,
+ "target_temperature_high_f": 75,
+ "target_temperature_low_c": 20.0,
+ "target_temperature_low_f": 68,
+ "temperature_scale": "C",
+ "time_to_target": "~0",
+ "time_to_target_training": "ready",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw",
+ "where_name": "Living Room"
+ }
+ }
+ },
+ "metadata": {
+ "access_token": "c.eQ5QBBPiFOTNzPHbmZPcE9yPZ7GayzLusifgQR2DQRFNyUS9ESvlhJF0D7vG8Y0TFV39zX1vIOsWrv8RKCMrFepNUb9FqHEboa4MtWLUsGb4tD9oBh0jrV4HooJUmz5sVA5KZR0dkxyLYyPc",
+ "client_version": 1
+ },
+ "structures": {
+ "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A": {
+ "away": "home",
+ "cameras": [
+ "_LK8j9rRXwCKEBOtDo7JskNxzWfHBOIm3CLouCT3FQZzrvokK_DzFQ",
+ "VG7C7BU6Zf8OjEfizmBCVnwnuKHSnOBIHgbQKa57xKJzrvokK_DzFQ"
+ ],
+ "co_alarm_state": "ok",
+ "country_code": "US",
+ "eta_begin": "2017-02-02T03:10:08.000Z",
+ "name": "Home",
+ "peak_period_end_time": "2017-07-01T01:03:08.400Z",
+ "peak_period_start_time": "2017-06-01T13:31:10.870Z",
+ "postal_code": "98056",
+ "rhr_enrollment": false,
+ "smoke_alarm_state": "ok",
+ "smoke_co_alarms": [
+ "p1b1oySOcs-OJHIgmgeMkHOu-iQr8PMV",
+ "p1b1oySOcs8Qu7IAJVrQ7XOu-iQr8PMV",
+ "p1b1oySOcs8W9WwaNu80oXOu-iQr8PMV",
+ "p1b1oySOcs_sbi4iczruW3Ou-iQr8PMV"
+ ],
+ "structure_id": "ysCnsCaq1pQwKUPP9H4AqE943C1XtLin3x6uCVN5Qh09IDyTg7Ey5A",
+ "thermostats": [
+ "G1jouHN5yl6mXFaQw5iGwXOu-iQr8PMV"
+ ],
+ "time_zone": "America/Los_Angeles",
+ "wheres": {
+ "6UAWzz8czKpFrH6EK3AcjDiTjbRgts8x5MJxEnn1yKKQpYTBO7n2UQ": {
+ "name": "Downstairs Kitchen",
+ "where_id": "6UAWzz8czKpFrH6EK3AcjDiTjbRgts8x5MJxEnn1yKKQpYTBO7n2UQ"
+ },
+ "8tH6YiXUAQDZFLD6AgMmQ14Sc5wTG0NxKfabPY0XKrqc47t3uSDZvQ": {
+ "name": "Frog",
+ "where_id": "8tH6YiXUAQDZFLD6AgMmQ14Sc5wTG0NxKfabPY0XKrqc47t3uSDZvQ"
+ },
+ "qpWvTu89Knhn6GRFM-VtGoE4KYwbzbJg9INR6WyPfhW1EJ04GRyYbQ": {
+ "name": "Garage",
+ "where_id": "qpWvTu89Knhn6GRFM-VtGoE4KYwbzbJg9INR6WyPfhW1EJ04GRyYbQ"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIAYVvcpN1cOA": {
+ "name": "Family Room",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIAYVvcpN1cOA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB2f05cPKRBA": {
+ "name": "Kitchen",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB2f05cPKRBA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB7GULj0y7Rw": {
+ "name": "Hallway",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIB7GULj0y7Rw"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIYpqdaXnYjUg": {
+ "name": "Basement",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIYpqdaXnYjUg"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIbTUmML4Q6xA": {
+ "name": "Kids Room",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIbTUmML4Q6xA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIebdVzhA62Iw": {
+ "name": "Master Bedroom",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIebdVzhA62Iw"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIm5E0NfJPeeg": {
+ "name": "Downstairs",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsIm5E0NfJPeeg"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJv12iEHQ0hxA": {
+ "name": "Driveway",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJv12iEHQ0hxA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJyRQEOtmKqkw": {
+ "name": "Den",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsJyRQEOtmKqkw"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK-nCnEjccnMQ": {
+ "name": "Bedroom",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK-nCnEjccnMQ"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK2kdsXRP3IFg": {
+ "name": "Entryway",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsK2kdsXRP3IFg"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA": {
+ "name": "Upstairs",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKCxvyZfxNpKA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw": {
+ "name": "Living Room",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKQrCrjN0yXiw"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKR8TWb9hTptQ": {
+ "name": "Outside",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKR8TWb9hTptQ"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKZphUIYeW39g": {
+ "name": "Dining Room",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKZphUIYeW39g"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKfexoqPTcUVA": {
+ "name": "Backyard",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKfexoqPTcUVA"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKtUyRb3je64Q": {
+ "name": "Office",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsKtUyRb3je64Q"
+ },
+ "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsLRu9lIioI47g": {
+ "name": "Front Yard",
+ "where_id": "z8fK075vJJPPWnXxLx1m3GskRSZQ64iQydB59k-UPsLRu9lIioI47g"
+ }
+ },
+ "wwn_security_state": "ok"
+ }
+ }
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<classpath>
+ <classpathentry kind="src" output="target/classes" path="src/main/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-11">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="con" path="org.eclipse.m2e.MAVEN2_CLASSPATH_CONTAINER">
+ <attributes>
+ <attribute name="maven.pomderived" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="src" output="target/test-classes" path="src/test/java">
+ <attributes>
+ <attribute name="optional" value="true"/>
+ <attribute name="maven.pomderived" value="true"/>
+ <attribute name="test" value="true"/>
+ </attributes>
+ </classpathentry>
+ <classpathentry kind="output" path="target/classes"/>
+</classpath>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+ <name>org.openhab.persistence.mapdb.tests</name>
+ <comment></comment>
+ <projects>
+ </projects>
+ <buildSpec>
+ <buildCommand>
+ <name>org.eclipse.jdt.core.javabuilder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ <buildCommand>
+ <name>org.eclipse.m2e.core.maven2Builder</name>
+ <arguments>
+ </arguments>
+ </buildCommand>
+ </buildSpec>
+ <natures>
+ <nature>org.eclipse.jdt.core.javanature</nature>
+ <nature>org.eclipse.m2e.core.maven2Nature</nature>
+ </natures>
+</projectDescription>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
+-include: ../itest-common.bndrun
+
+Bundle-SymbolicName: ${project.artifactId}
+Fragment-Host: org.openhab.persistence.mapdb
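+# As a fragment of the bundle under test, the test bundle shares its class loader and can access its internal classes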
+
+-runrequires: \
+ bnd.identity;id='org.openhab.persistence.mapdb.tests',\
+ bnd.identity;id='org.openhab.core',\
+ bnd.identity;id='org.openhab.persistence.mapdb'
+
+#
+# done
+#
+-runbundles: \
+ ch.qos.logback.classic;version='[1.2.3,1.2.4)',\
+ ch.qos.logback.core;version='[1.2.3,1.2.4)',\
+ com.google.gson;version='[2.8.2,2.8.3)',\
+ javax.measure.unit-api;version='[1.0.0,1.0.1)',\
+ org.apache.felix.http.servlet-api;version='[1.1.2,1.1.3)',\
+ org.apache.felix.scr;version='[2.1.10,2.1.11)',\
+ org.eclipse.equinox.event;version='[1.4.300,1.4.301)',\
+ org.openhab.core;version='[3.0.0,3.0.1)',\
+ org.openhab.core.config.core;version='[3.0.0,3.0.1)',\
+ org.openhab.core.persistence;version='[3.0.0,3.0.1)',\
+ org.openhab.core.test;version='[3.0.0,3.0.1)',\
+ org.osgi.service.event;version='[1.4.0,1.4.1)',\
+ slf4j.api;version='[1.7.25,1.7.26)',\
+ org.apache.servicemix.specs.activation-api-1.1;version='[2.9.0,2.9.1)',\
+ org.apache.servicemix.specs.jaxb-api-2.2;version='[2.9.0,2.9.1)',\
+ org.apache.servicemix.specs.stax-api-1.2;version='[2.9.0,2.9.1)',\
+ tec.uom.lib.uom-lib-common;version='[1.0.3,1.0.4)',\
+ tec.uom.se;version='[1.0.10,1.0.11)',\
+ org.apache.servicemix.bundles.jaxb-impl;version='[2.2.11,2.2.12)',\
+ org.eclipse.jetty.http;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.io;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.security;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.server;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.servlet;version='[9.4.20,9.4.21)',\
+ org.eclipse.jetty.util;version='[9.4.20,9.4.21)',\
+ org.openhab.persistence.mapdb;version='[3.0.0,3.0.1)',\
+ org.openhab.persistence.mapdb.tests;version='[3.0.0,3.0.1)',\
+ biz.aQute.tester.junit-platform;version='[5.1.2,5.1.3)',\
+ junit-jupiter-api;version='[5.6.2,5.6.3)',\
+ junit-jupiter-engine;version='[5.6.2,5.6.3)',\
+ junit-platform-commons;version='[1.6.2,1.6.3)',\
+ junit-platform-engine;version='[1.6.2,1.6.3)',\
+ junit-platform-launcher;version='[1.6.2,1.6.3)',\
+ org.hamcrest;version='[2.2.0,2.2.1)',\
+ org.opentest4j;version='[1.2.0,1.2.1)'
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons.itests</groupId>
+ <artifactId>org.openhab.addons.reactor.itests</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <artifactId>org.openhab.persistence.mapdb.tests</artifactId>
+
+ <name>openHAB Add-ons :: Integration Tests :: MapDB Persistence Tests</name>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.openhab.addons.bundles</groupId>
+ <artifactId>org.openhab.persistence.mapdb</artifactId>
+ <version>${project.version}</version>
+ </dependency>
+ </dependencies>
+
+</project>
--- /dev/null
+/**
+ * Copyright (c) 2010-2020 Contributors to the openHAB project
+ *
+ * See the NOTICE file(s) distributed with this work for additional
+ * information.
+ *
+ * This program and the accompanying materials are made available under the
+ * terms of the Eclipse Public License 2.0 which is available at
+ * http://www.eclipse.org/legal/epl-2.0
+ *
+ * SPDX-License-Identifier: EPL-2.0
+ */
+package org.openhab.persistence.mapdb;
+
+import static org.hamcrest.CoreMatchers.*;
+import static org.hamcrest.MatcherAssert.assertThat;
+import static org.hamcrest.beans.HasPropertyWithValue.hasProperty;
+import static org.hamcrest.collection.IsEmptyIterable.emptyIterable;
+import static org.hamcrest.collection.IsIterableContainingInOrder.contains;
+
+import java.io.File;
+import java.io.IOException;
+import java.nio.file.Files;
+import java.nio.file.Path;
+import java.nio.file.Paths;
+import java.util.Comparator;
+import java.util.stream.Stream;
+
+import org.junit.jupiter.api.AfterAll;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.Test;
+import org.openhab.core.items.GenericItem;
+import org.openhab.core.library.items.ColorItem;
+import org.openhab.core.library.items.DimmerItem;
+import org.openhab.core.library.items.SwitchItem;
+import org.openhab.core.library.types.HSBType;
+import org.openhab.core.library.types.OnOffType;
+import org.openhab.core.library.types.PercentType;
+import org.openhab.core.persistence.FilterCriteria;
+import org.openhab.core.persistence.QueryablePersistenceService;
+import org.openhab.core.test.java.JavaOSGiTest;
+import org.openhab.core.types.State;
+import org.openhab.persistence.mapdb.internal.MapDbPersistenceService;
+
+/**
+ * Integration tests for the {@link MapDbPersistenceService}.
+ *
+ * @author Martin Kühl - Initial contribution
+ */
+public class MapDbPersistenceServiceOSGiTest extends JavaOSGiTest {
+ private MapDbPersistenceService persistenceService;
+
+ @BeforeEach
+ public void setUp() {
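+ // Look up the MapDB implementation of the queryable persistence service from the OSGi service registry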
+ persistenceService = getService(QueryablePersistenceService.class, MapDbPersistenceService.class);
+ }
+
+ @AfterAll
+ public static void tearDown() throws IOException {
+ // clean up database files ...
+ removeDirRecursive("userdata");
+ removeDirRecursive("runtime");
+ }
+
+ private static void removeDirRecursive(final String dir) throws IOException {
+ final Path path = Paths.get(dir);
+ if (Files.exists(path)) {
+ // Walk the tree in reverse order so children are deleted before their parent directories,
+ // otherwise File::delete silently fails on the still non-empty directories
+ try (Stream<Path> paths = Files.walk(path)) {
+ paths.sorted(Comparator.reverseOrder()).map(Path::toFile).forEach(File::delete);
+ }
+ }
+ }
+
+ @Test
+ public void storeShouldStoreTheItem() {
+ String name = "switch1";
+ String alias = "switch2";
+ State state = OnOffType.ON;
+
+ GenericItem item = new SwitchItem(name);
+ item.setState(state);
+
+ assertThat(persistenceService.getItemInfo(), not(hasItem(hasProperty("name", equalTo(name)))));
+
+ persistenceService.store(item);
+
+ assertThat(persistenceService.getItemInfo(), hasItem(hasProperty("name", equalTo(name))));
+
+ persistenceService.store(item, alias);
+
+ assertThat(persistenceService.getItemInfo(),
+ hasItems(hasProperty("name", equalTo(name)), hasProperty("name", equalTo(alias))));
+ }
+
+ @Test
+ public void queryShouldFindStoredItemsByName() {
+ String name = "dimmer";
+ State state = PercentType.HUNDRED;
+
+ GenericItem item = new DimmerItem(name);
+ item.setState(state);
+
+ FilterCriteria filter = new FilterCriteria();
+ filter.setItemName(name);
+
+ assertThat(persistenceService.query(filter), is(emptyIterable()));
+
+ persistenceService.store(item);
+
+ assertThat(persistenceService.query(filter),
+ contains(allOf(hasProperty("name", equalTo(name)), hasProperty("state", equalTo(state)))));
+ }
+
+ @Test
+ public void queryShouldFindStoredItemsByAlias() {
+ String name = "color";
+ String alias = "alias";
+ State state = HSBType.GREEN;
+
+ GenericItem item = new ColorItem(name);
+ item.setState(state);
+
+ FilterCriteria filterByName = new FilterCriteria();
+ filterByName.setItemName(name);
+
+ FilterCriteria filterByAlias = new FilterCriteria();
+ filterByAlias.setItemName(alias);
+
+ assertThat(persistenceService.query(filterByName), is(emptyIterable()));
+ assertThat(persistenceService.query(filterByAlias), is(emptyIterable()));
+
+ persistenceService.store(item, alias);
+
+ assertThat(persistenceService.query(filterByName), is(emptyIterable()));
+ assertThat(persistenceService.query(filterByAlias),
+ contains(allOf(hasProperty("name", equalTo(alias)), hasProperty("state", equalTo(state)))));
+ }
+}
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab.addons</groupId>
+ <artifactId>org.openhab.addons.reactor</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+ </parent>
+
+ <groupId>org.openhab.addons.itests</groupId>
+ <artifactId>org.openhab.addons.reactor.itests</artifactId>
+ <packaging>pom</packaging>
+
+ <name>openHAB Add-ons :: Integration Tests</name>
+
+ <modules>
+ <module>org.openhab.binding.nest.tests</module>
+ <module>org.openhab.persistence.mapdb.tests</module>
+ </modules>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.openhab-core</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.compile</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <pluginManagement>
+ <plugins>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-jar-plugin</artifactId>
+ <configuration>
+ <archive>
+ <manifestFile>${project.build.outputDirectory}/META-INF/MANIFEST.MF</manifestFile>
+ </archive>
+ <skipIfEmpty>true</skipIfEmpty>
+ </configuration>
+ </plugin>
+ </plugins>
+ </pluginManagement>
+ <plugins>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-maven-plugin</artifactId>
+ </plugin>
+ </plugins>
+ </build>
+
+ <profiles>
+ <profile>
+ <!-- BEG: itests common -->
+ <id>itests-common</id>
+ <activation>
+ <file>
+ <exists>itest.bndrun</exists>
+ </file>
+ </activation>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.test</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.test-index</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.openhab-core-index</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.runtime-index</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-maven-plugin</artifactId>
+ <configuration>
+ <bndfile>itest.bndrun</bndfile>
+ </configuration>
+ </plugin>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-indexer-maven-plugin</artifactId>
+ <configuration>
+ <includeJar>true</includeJar>
+ </configuration>
+ </plugin>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-testing-maven-plugin</artifactId>
+ <configuration>
+ <bndruns>
+ <bndrun>itest.bndrun</bndrun>
+ </bndruns>
+ </configuration>
+ </plugin>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-resolver-maven-plugin</artifactId>
+ <configuration>
+ <bndruns>
+ <bndrun>itest.bndrun</bndrun>
+ </bndruns>
+ </configuration>
+ </plugin>
+ </plugins>
+ </build>
+
+ </profile>
+ <!-- END: itests common -->
+ </profiles>
+
+</project>
--- /dev/null
+Copyright (c) 2010-${year} Contributors to the openHAB project
+
+See the NOTICE file(s) distributed with this work for additional
+information.
+
+This program and the accompanying materials are made available under the
+terms of the Eclipse Public License 2.0 which is available at
+http://www.eclipse.org/legal/epl-2.0
+
+SPDX-License-Identifier: EPL-2.0
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<additionalHeaders>
+ <xml-header-style>
+ <firstLine><![CDATA[<!--EOL]]></firstLine>
+ <beforeEachLine>	</beforeEachLine>
+ <endLine><![CDATA[EOL-->]]></endLine>
+ <skipLine><![CDATA[^<\?xml.*>$]]></skipLine>
+ <firstLineDetectionPattern><![CDATA[(\s|\t)*<!--.*$]]></firstLineDetectionPattern>
+ <lastLineDetectionPattern><![CDATA[.*-->(\s|\t)*$]]></lastLineDetectionPattern>
+ <allowBlankLines>true</allowBlankLines>
+ <isMultiline>true</isMultiline>
+ </xml-header-style>
+</additionalHeaders>
--- /dev/null
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+ <modelVersion>4.0.0</modelVersion>
+
+ <parent>
+ <groupId>org.openhab</groupId>
+ <artifactId>openhab-super-pom</artifactId>
+ <version>[1.0, 2.0)</version>
+ </parent>
+
+ <groupId>org.openhab.addons</groupId>
+ <artifactId>org.openhab.addons.reactor</artifactId>
+ <version>3.0.0-SNAPSHOT</version>
+
+ <packaging>pom</packaging>
+
+ <name>openHAB Add-ons</name>
+ <description>This project contains the official add-ons of openHAB</description>
+
+ <organization>
+ <name>openHAB.org</name>
+ <url>https://www.openhab.org</url>
+ </organization>
+
+ <licenses>
+ <license>
+ <name>Eclipse Public License 2.0</name>
+ <url>https://www.eclipse.org/legal/epl-2.0/</url>
+ </license>
+ </licenses>
+
+ <modules>
+ <module>bom</module>
+ <module>bundles</module>
+ <module>features</module>
+ <module>itests</module>
+ </modules>
+
+ <scm>
+ <connection>scm:git:https://github.com/openhab/openhab-addons.git</connection>
+ <developerConnection>scm:git:https://github.com/openhab/openhab-addons.git</developerConnection>
+ <tag>HEAD</tag>
+ <url>https://github.com/openhab/openhab-addons</url>
+ </scm>
+
+ <issueManagement>
+ <system>GitHub</system>
+ <url>https://github.com/openhab/openhab-addons/issues</url>
+ </issueManagement>
+
+ <distributionManagement>
+ <repository>
+ <id>bintray</id>
+ <url>${oh.repo.distBaseUrl}/openhab-addons/;publish=1</url>
+ </repository>
+ <snapshotRepository>
+ <id>jfrog</id>
+ <url>${oh.repo.snapshotBaseUrl}/libs-snapshot-local</url>
+ </snapshotRepository>
+ </distributionManagement>
+
+ <properties>
+ <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+ <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
+ <oh.java.version>11</oh.java.version>
+ <maven.compiler.source>${oh.java.version}</maven.compiler.source>
+ <maven.compiler.target>${oh.java.version}</maven.compiler.target>
+ <maven.compiler.compilerVersion>${oh.java.version}</maven.compiler.compilerVersion>
+
+ <ohc.version>${project.version}</ohc.version>
+ <bnd.version>5.1.2</bnd.version>
+ <karaf.version>4.2.7</karaf.version>
+ <sat.version>0.10.0</sat.version>
+ <slf4j.version>1.7.21</slf4j.version>
+
+ <bnd.importpackage/>
+ <bnd.exportpackage/>
+ <bnd.includeresource>-${.}/NOTICE, -${.}/*.xsd</bnd.includeresource>
+
+ <feature.directory>src/main/feature/feature.xml</feature.directory>
+ <spotless.version>2.0.3</spotless.version>
+ </properties>
+
+ <dependencyManagement>
+ <dependencies>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.compile</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.compile-model</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.openhab-core</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>provided</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.runtime</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>runtime</scope>
+ </dependency>
+ <dependency>
+ <groupId>org.openhab.core.bom</groupId>
+ <artifactId>org.openhab.core.bom.test</artifactId>
+ <version>${ohc.version}</version>
+ <type>pom</type>
+ <scope>test</scope>
+ </dependency>
+ </dependencies>
+ </dependencyManagement>
+
+ <build>
+ <pluginManagement>
+ <plugins>
+
+ <!-- BEG: bnd -->
+
+ <!-- Use the bnd-maven-plugin and assemble the symbolic names -->
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-maven-plugin</artifactId>
+ <version>${bnd.version}</version>
+ <configuration>
+ <bnd><![CDATA[Bundle-SymbolicName: ${project.artifactId}
+Automatic-Module-Name: ${def;bsn}
+Import-Package: \\
+ io.swagger.v3.oas.annotations.*;resolution:=optional,\\
+ javax.annotation.security.*;resolution:=optional,\\
+ org.eclipse.jdt.annotation.*;resolution:=optional,\\
+ org.openhab.core.automation.annotation.*;resolution:=optional;version=!,\\
+ org.openhab.*;version=!,\\
+ com.google.common.*;version="14.0",\\
+ ${bnd.importpackage},\\
+ *
+-exportcontents: \\
+ !*.internal.*,\\
+ !*.impl.*, \\
+ org.openhab.*, \\
+ ${bnd.exportpackage}
+-sources: false
+-contract: *
+-includeresource: ${bnd.includeresource}]]></bnd>
+ <!-- -dsannotations-options: norequirements -->
+ <!-- Bundle-SymbolicName: ${project.groupId}.${project.artifactId} -->
+ </configuration>
+ <executions>
+ <execution>
+ <goals>
+ <goal>bnd-process</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
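+ <!-- The bnd instructions above derive the symbolic name from the artifactId, mark annotation-only -->
+ <!-- packages as optional at resolve time, import org.openhab packages without version ranges -->
+ <!-- (version=!), and keep *.internal.* and *.impl.* packages out of the exported API. -->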
+ <!-- Required to make the maven-jar-plugin pick up the bnd-generated manifest. Also avoids packaging empty JARs. -->
+ <!-- Moved... -->
+
+ <!-- Set up the indexer for running and testing -->
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-indexer-maven-plugin</artifactId>
+ <version>${bnd.version}</version>
+ <configuration>
+ <localURLs>REQUIRED</localURLs>
+ <attach>false</attach>
+ </configuration>
+ <executions>
+ <execution>
+ <id>index</id>
+ <goals>
+ <goal>index</goal>
+ </goals>
+ <configuration>
+ <indexName>${project.artifactId}</indexName>
+ </configuration>
+ </execution>
+ <execution>
+ <id>test-index</id>
+ <goals>
+ <goal>index</goal>
+ </goals>
+ <configuration>
+ <indexName>${project.artifactId}</indexName>
+ <outputFile>${project.build.directory}/test-index.xml</outputFile>
+ <scopes>
+ <scope>test</scope>
+ </scopes>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
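+ <!-- Two repository indexes are generated per bundle: the main index and target/test-index.xml -->
+ <!-- for the test scope; these are typically consumed when resolving bundles for integration -->
+ <!-- testing. localURLs=REQUIRED makes the indexes reference the local Maven repository. -->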
+
+ <!-- Define the version of the resolver plugin we use -->
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-resolver-maven-plugin</artifactId>
+ <version>${bnd.version}</version>
+ <configuration>
+ <failOnChanges>false</failOnChanges>
+ <bndruns/>
+ </configuration>
+ <executions>
+ <execution>
+ <goals>
+ <goal>resolve</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
+
+ <!-- Define the version of the export plugin we use -->
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-export-maven-plugin</artifactId>
+ <version>${bnd.version}</version>
+ <configuration>
+ <resolve>true</resolve>
+ <failOnChanges>true</failOnChanges>
+ </configuration>
+ <executions>
+ <execution>
+ <goals>
+ <goal>export</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
+
+ <!-- Define the version of the testing plugin that we use -->
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-testing-maven-plugin</artifactId>
+ <version>${bnd.version}</version>
+ <executions>
+ <execution>
+ <goals>
+ <goal>testing</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
+
+ <!-- Define the version of the baseline plugin we use and avoid failing when no baseline jar exists. -->
+ <!-- (for example before the first release) -->
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-baseline-maven-plugin</artifactId>
+ <version>${bnd.version}</version>
+ <configuration>
+ <failOnMissing>false</failOnMissing>
+ </configuration>
+ <executions>
+ <execution>
+ <goals>
+ <goal>baseline</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
+
+ <!-- END: bnd -->
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-clean-plugin</artifactId>
+ <version>3.0.0</version>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-compiler-plugin</artifactId>
+ <version>3.8.0</version>
+ <configuration>
+ <compilerId>eclipse</compilerId>
+ <compilerArgs>
+ <arg>-err:+nullAnnot(org.eclipse.jdt.annotation.Nullable|org.eclipse.jdt.annotation.NonNull|org.eclipse.jdt.annotation.NonNullByDefault),+inheritNullAnnot,-nullUncheckedConversion</arg>
+ <arg>-warn:+null,+inheritNullAnnot,+nullAnnotConflict,-nullUncheckedConversion,+nullAnnotRedundant,+nullDereference</arg>
+ </compilerArgs>
+ <showWarnings>true</showWarnings>
+ <showDeprecation>true</showDeprecation>
+ </configuration>
+ <dependencies>
+ <dependency>
+ <groupId>org.codehaus.plexus</groupId>
+ <artifactId>plexus-compiler-eclipse</artifactId>
+ <version>2.8.5</version>
+ </dependency>
+ <dependency>
+ <groupId>org.eclipse.jdt</groupId>
+ <artifactId>ecj</artifactId>
+ <version>3.16.0</version>
+ </dependency>
+ </dependencies>
+ </plugin>
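+ <!-- The Eclipse compiler (ECJ) is used instead of javac so that the org.eclipse.jdt null -->
+ <!-- annotations can be enforced: violations of @Nullable/@NonNull/@NonNullByDefault contracts -->
+ <!-- are reported as compile errors, further null-analysis findings as warnings. -->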
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-enforcer-plugin</artifactId>
+ <version>3.0.0-M2</version>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-install-plugin</artifactId>
+ <version>2.5.2</version>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-jar-plugin</artifactId>
+ <version>3.0.2</version>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-javadoc-plugin</artifactId>
+ <version>2.10.3</version>
+ <configuration>
+ <failOnError>!${quality.skip}</failOnError>
+ </configuration>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-plugin-plugin</artifactId>
+ <version>3.6.0</version>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-release-plugin</artifactId>
+ <version>2.5.2</version>
+ <configuration>
+ <preparationGoals>clean install</preparationGoals>
+ </configuration>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-resources-plugin</artifactId>
+ <version>3.0.2</version>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-site-plugin</artifactId>
+ <version>3.7.1</version>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-source-plugin</artifactId>
+ <version>3.0.1</version>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-surefire-plugin</artifactId>
+ <version>3.0.0-M5</version>
+ </plugin>
+
+ <plugin>
+ <groupId>org.codehaus.mojo</groupId>
+ <artifactId>build-helper-maven-plugin</artifactId>
+ <version>3.0.0</version>
+ </plugin>
+
+ <plugin>
+ <groupId>com.mycila</groupId>
+ <artifactId>license-maven-plugin</artifactId>
+ <version>3.0</version>
+ <configuration>
+ <basedir>${basedir}</basedir>
+ <header>licenses/epl-2.0/header.txt</header>
+ <quiet>false</quiet>
+ <failIfMissing>true</failIfMissing>
+ <strictCheck>true</strictCheck>
+ <aggregate>true</aggregate>
+ <mapping>
+ <xml>xml-header-style</xml>
+ </mapping>
+ <headerDefinitions>
+ <headerDefinition>licenses/epl-2.0/xml-header-style.xml</headerDefinition>
+ </headerDefinitions>
+ <includes>
+ <include>**/org/openhab/**/*.java</include>
+ <include>**/features/**/header.xml</include>
+ </includes>
+ <excludes>
+ <exclude>target/**</exclude>
+ <exclude>**/pom.xml</exclude>
+ <exclude>_*.java</exclude>
+ </excludes>
+ <useDefaultExcludes>true</useDefaultExcludes>
+ <properties>
+ <year>2020</year>
+ </properties>
+ <encoding>UTF-8</encoding>
+ </configuration>
+ <executions>
+ <execution>
+ <goals>
+ <goal>check</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
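+ <!-- Verifies that all matched sources carry the EPL-2.0 license header. Missing headers can -->
+ <!-- usually be added with the plugin's format goal (mvn license:format); this build only wires up the check. -->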
+
+ <!-- This plugin's configuration is used to store Eclipse m2e settings only. -->
+ <!-- It has no influence on the Maven build itself. -->
+ <plugin>
+ <groupId>org.eclipse.m2e</groupId>
+ <artifactId>lifecycle-mapping</artifactId>
+ <version>1.0.0</version>
+ <configuration>
+ <lifecycleMappingMetadata>
+ <pluginExecutions>
+ <pluginExecution>
+ <pluginExecutionFilter>
+ <groupId>org.apache.karaf.tooling</groupId>
+ <artifactId>karaf-maven-plugin</artifactId>
+ <versionRange>[4.2.1,)</versionRange>
+ <goals>
+ <goal>features-generate-descriptor</goal>
+ <goal>verify</goal>
+ </goals>
+ </pluginExecutionFilter>
+ <action>
+ <ignore/>
+ </action>
+ </pluginExecution>
+ <pluginExecution>
+ <pluginExecutionFilter>
+ <groupId>org.codehaus.mojo</groupId>
+ <artifactId>exec-maven-plugin</artifactId>
+ <versionRange>[1.4.0,)</versionRange>
+ <goals>
+ <goal>java</goal>
+ </goals>
+ </pluginExecutionFilter>
+ <action>
+ <ignore/>
+ </action>
+ </pluginExecution>
+ <pluginExecution>
+ <pluginExecutionFilter>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-indexer-maven-plugin</artifactId>
+ <versionRange>[3.1.0,)</versionRange>
+ <goals>
+ <goal>index</goal>
+ <goal>local-index</goal>
+ </goals>
+ </pluginExecutionFilter>
+ <action>
+ <ignore/>
+ </action>
+ </pluginExecution>
+ <pluginExecution>
+ <pluginExecutionFilter>
+ <groupId>org.commonjava.maven.plugins</groupId>
+ <artifactId>directory-maven-plugin</artifactId>
+ <versionRange>[0.3.1,)</versionRange>
+ <goals>
+ <goal>directory-of</goal>
+ </goals>
+ </pluginExecutionFilter>
+ <action>
+ <ignore/>
+ </action>
+ </pluginExecution>
+ <pluginExecution>
+ <pluginExecutionFilter>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-dependency-plugin</artifactId>
+ <versionRange>[3.0.0,)</versionRange>
+ <goals>
+ <goal>unpack</goal>
+ <goal>unpack-dependencies</goal>
+ </goals>
+ </pluginExecutionFilter>
+ <action>
+ <execute/>
+ </action>
+ </pluginExecution>
+ </pluginExecutions>
+ </lifecycleMappingMetadata>
+ </configuration>
+ </plugin>
+
+ <plugin>
+ <groupId>org.openhab.tools.sat</groupId>
+ <artifactId>sat-plugin</artifactId>
+ <version>${sat.version}</version>
+ <configuration>
+ <checkstyleProperties>${basedirRoot}/tools/static-code-analysis/checkstyle/ruleset.properties</checkstyleProperties>
+ <checkstyleFilter>${basedirRoot}/tools/static-code-analysis/checkstyle/suppressions.xml</checkstyleFilter>
+ </configuration>
+ <executions>
+ <execution>
+ <id>sat-all</id>
+ <goals>
+ <goal>checkstyle</goal>
+ <goal>pmd</goal>
+ <goal>spotbugs</goal>
+ <goal>report</goal>
+ </goals>
+ <phase>verify</phase>
+ </execution>
+ </executions>
+ </plugin>
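+ <!-- The static code analysis rulesets are resolved via ${basedirRoot}, which the -->
+ <!-- directory-maven-plugin below sets to the directory of the reactor root project. -->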
+ <plugin>
+ <groupId>com.diffplug.spotless</groupId>
+ <artifactId>spotless-maven-plugin</artifactId>
+ <version>${spotless.version}</version>
+ <configuration>
+ <java>
+ <eclipse>
+ <file>openhab_codestyle.xml</file>
+ <version>4.13.0</version>
+ </eclipse>
+ <removeUnusedImports/>
+ <importOrder>
+ <file>openhab.importorder</file>
+ </importOrder>
+ <endWithNewline/>
+ </java>
+ <formats>
+ <format>
+ <!-- *.xml -->
+ <includes>
+ <include>src/**/*.xml</include>
+ </includes>
+ <excludes>
+ <exclude>**/pom.xml</exclude>
+ <exclude>**/feature.xml</exclude>
+ <exclude>src/main/history/**/*.xml</exclude>
+ <exclude>features/openhab-addons/src/main/resources/header.xml</exclude>
+ <exclude>features/openhab-addons/src/main/resources/footer.xml</exclude>
+ <exclude>src/main/resources/input/rss*.xml</exclude>
+ <exclude>src/test/resources/**/*.xml</exclude>
+ </excludes>
+ <eclipseWtp>
+ <type>XML</type>
+ <files>
+ <file>openhab_wst_xml_files.prefs</file>
+ </files>
+ <version>4.13.0</version>
+ </eclipseWtp>
+ <trimTrailingWhitespace/>
+ <endWithNewline/>
+ </format>
+ <format>
+ <!-- feature.xml -->
+ <includes>
+ <include>src/main/feature/feature.xml</include>
+ </includes>
+ <eclipseWtp>
+ <type>XML</type>
+ <files>
+ <file>openhab_wst_feature_file.prefs</file>
+ </files>
+ <version>4.13.0</version>
+ </eclipseWtp>
+ <trimTrailingWhitespace/>
+ <endWithNewline/>
+ </format>
+ <format>
+ <!-- pom.xml -->
+ <includes>
+ <include>pom.xml</include>
+ </includes>
+ <eclipseWtp>
+ <type>XML</type>
+ <files>
+ <file>openhab_wst_pom_file.prefs</file>
+ </files>
+ <version>4.13.0</version>
+ </eclipseWtp>
+ <trimTrailingWhitespace/>
+ <endWithNewline/>
+ </format>
+ </formats>
+ </configuration>
+ <dependencies>
+ <dependency>
+ <groupId>org.openhab.tools</groupId>
+ <artifactId>openhab-codestyle</artifactId>
+ <version>${sat.version}</version>
+ </dependency>
+ </dependencies>
+ <executions>
+ <execution>
+ <id>codestyle_check</id>
+ <goals>
+ <goal>check</goal>
+ </goals>
+ <phase>initialize</phase>
+ </execution>
+ </executions>
+ </plugin>
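+ <!-- Spotless fails the build on formatting violations during initialize; running -->
+ <!-- "mvn spotless:apply" reformats the sources according to the same configuration. -->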
+ </plugins>
+ </pluginManagement>
+
+ <plugins>
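+ <!-- Stores the root directory of this reactor build in the basedirRoot property, -->
+ <!-- which is referenced e.g. by the sat-plugin configuration above. -->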
+ <plugin>
+ <groupId>org.commonjava.maven.plugins</groupId>
+ <artifactId>directory-maven-plugin</artifactId>
+ <version>0.3.1</version>
+ <executions>
+ <execution>
+ <id>directories</id>
+ <goals>
+ <goal>directory-of</goal>
+ </goals>
+ <phase>initialize</phase>
+ <configuration>
+ <property>basedirRoot</property>
+ <project>
+ <groupId>org.openhab.addons</groupId>
+ <artifactId>org.openhab.addons.reactor</artifactId>
+ </project>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+
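+ <!-- Fail fast unless the build runs on Java 11 (see the requireJavaVersion rule below) -->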
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-enforcer-plugin</artifactId>
+ <executions>
+ <execution>
+ <id>enforce-java</id>
+ <goals>
+ <goal>enforce</goal>
+ </goals>
+ <configuration>
+ <rules>
+ <requireJavaVersion>
+ <version>[11.0,12.0)</version>
+ </requireJavaVersion>
+ </rules>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ <plugin>
+ <groupId>com.diffplug.spotless</groupId>
+ <artifactId>spotless-maven-plugin</artifactId>
+ </plugin>
+ </plugins>
+ <extensions>
+ <extension>
+ <groupId>org.openhab.tools.sat</groupId>
+ <artifactId>sat-extension</artifactId>
+ <version>${sat.version}</version>
+ </extension>
+ </extensions>
+ </build>
+
+ <profiles>
+ <profile>
+ <id>skip-check</id>
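+ <!-- Skips the static code analysis, e.g.: mvn clean install -DskipChecks -->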
+ <activation>
+ <property>
+ <name>skipChecks</name>
+ </property>
+ </activation>
+ <build>
+ <pluginManagement>
+ <plugins>
+ <plugin>
+ <groupId>org.openhab.tools.sat</groupId>
+ <artifactId>sat-plugin</artifactId>
+ <version>${sat.version}</version>
+ <executions>
+ <execution>
+ <id>sat-all</id>
+ <phase>none</phase>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </pluginManagement>
+ </build>
+ </profile>
+ <profile>
+ <id>check-bundles</id>
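+ <!-- Enables the static code analysis in all modules that contain a src directory, i.e. the actual add-on bundles -->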
+ <activation>
+ <file>
+ <exists>src</exists>
+ </file>
+ </activation>
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>org.openhab.tools.sat</groupId>
+ <artifactId>sat-plugin</artifactId>
+ </plugin>
+ </plugins>
+ </build>
+ </profile>
+ <profile>
+ <id>add-libraries-cli</id>
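+ <!-- Embeds the jars from a bundle's lib/ directory into the bundle when building on the -->
+ <!-- command line; the ${.}-relative paths differ from the add-libraries-eclipse profile -->
+ <!-- below since ${.} appears to resolve differently in command-line and m2e builds. -->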
+ <activation>
+ <property>
+ <name>!m2e.version</name>
+ </property>
+ <file>
+ <exists>lib/</exists>
+ </file>
+ </activation>
+ <properties>
+ <bnd.includeresource>-${.}/../../NOTICE, -${.}/../../*.xsd, ${.}/../../lib/;filter:=*.jar;lib:=true</bnd.includeresource>
+ <feature.directory>../../src/main/feature/feature.xml</feature.directory>
+ </properties>
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>com.googlecode.addjars-maven-plugin</groupId>
+ <artifactId>addjars-maven-plugin</artifactId>
+ <version>1.0.5</version>
+ <executions>
+ <execution>
+ <goals>
+ <goal>add-jars</goal>
+ </goals>
+ <configuration>
+ <resources>
+ <resource>
+ <directory>${project.basedir}/lib</directory>
+ <includes>
+ <include>**/*.jar</include>
+ </includes>
+ <scope>provided</scope>
+ </resource>
+ </resources>
+ </configuration>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </build>
+ </profile>
+ <profile>
+ <id>add-libraries-eclipse</id>
+ <activation>
+ <property>
+ <name>m2e.version</name>
+ </property>
+ <file>
+ <exists>lib/</exists>
+ </file>
+ </activation>
+ <properties>
+ <bnd.includeresource>-${.}/NOTICE, -${.}/*.xsd, ${.}/lib/;filter:=*.jar;lib:=true</bnd.includeresource>
+ </properties>
+ </profile>
+ <profile>
+ <id>with-bnd-resolver-resolve</id>
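+ <!-- Runs the bnd resolver in the package phase when enabled on the command line, e.g.: mvn clean install -DwithResolver -->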
+ <activation>
+ <property>
+ <name>withResolver</name>
+ </property>
+ </activation>
+ <build>
+ <pluginManagement>
+ <plugins>
+ <plugin>
+ <groupId>biz.aQute.bnd</groupId>
+ <artifactId>bnd-resolver-maven-plugin</artifactId>
+ <version>${bnd.version}</version>
+ <executions>
+ <execution>
+ <goals>
+ <goal>resolve</goal>
+ </goals>
+ <phase>package</phase>
+ </execution>
+ </executions>
+ </plugin>
+ </plugins>
+ </pluginManagement>
+ </build>
+ </profile>
+ </profiles>
+
+</project>
--- /dev/null
+This content is produced and maintained by the openHAB project.
+
+* Project home: https://www.openhab.org
+
+== Declared Project Licenses
+
+This program and the accompanying materials are made available under the terms
+of the Eclipse Public License 2.0 which is available at
+https://www.eclipse.org/legal/epl-2.0/.
+
+== Source Code
+
+https://github.com/openhab/openhab-addons
--- /dev/null
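+# The {0} and {1} placeholders in the header regexp below are filled with the
+# comma-separated headerCheck.values (the copyright year range).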
+checkstyle.headerCheck.content=^/\\*\\*$\\n^ \\* Copyright \\(c\\) {0}-{1} Contributors to the openHAB project$\\n^ \\*$\\n^ \\* See the NOTICE file\\(s\\) distributed with this work for additional$\\n^ \\* information.$\\n^ \\*$\\n^ \\* This program and the accompanying materials are made available under the$\\n^ \\* terms of the Eclipse Public License 2\\.0 which is available at$\\n^ \\* http://www.eclipse.org/legal/epl\\-2\\.0$\\n^ \\*$\\n^ \\* SPDX-License-Identifier: EPL-2.0$
+checkstyle.headerCheck.values=2010,2020
+checkstyle.forbiddenPackageUsageCheck.forbiddenPackages=com.google.common,gnu.io,javax.comm,org.apache.commons,org.joda.time
+checkstyle.forbiddenPackageUsageCheck.exceptions=
+checkstyle.requiredFilesCheck.files=pom.xml
--- /dev/null
+<?xml version="1.0" encoding="UTF-8"?>
+<!DOCTYPE suppressions PUBLIC "-//Puppy Crawl//DTD Suppressions 1.1//EN" "http://www.puppycrawl.com/dtds/suppressions_1_1.dtd">
+<suppressions>
+ <!-- These suppressions define which checks are suppressed for which files. -->
+ <suppress files=".+[\\/]internal[\\/].+\.java" checks="JavadocType|JavadocVariable|JavadocMethod|MissingJavadocFilterCheck"/>
+ <suppress files=".+DTO\.java" checks="JavadocType|JavadocVariable|JavadocMethod|MissingJavadocFilterCheck|NullAnnotationsCheck" />
+ <suppress files=".+Impl\.java" checks="JavadocType|JavadocVariable|JavadocMethod|MissingJavadocFilterCheck"/>
+
+ <!-- The Homematic and Tellstick bindings create and configure things dynamically, so the check would log false positives for unused configuration -->
+ <!-- The IO and Voice bundles have specific use cases - they only use the config .xml files and would log false positives as well -->
+ <suppress files=".+org.openhab.binding.homematic.+|.+org.openhab.binding.tellstick.+|.+org.openhab.io.+|.+org.openhab.voice.+" checks="EshInfXmlCheck"/>
+ <!-- Some checks are suppressed for test bundles -->
+ <suppress files=".+.test[\\/].+" checks="RequireBundleCheck"/>
+ <!-- There is a single class inside org.openhab.voice.voicerss.tool which is meant to be called from the command line.
+ Moving it to an "internal" package is not ideal either, as it is a documented tool. -->
+ <suppress files=".+org.openhab.voice.voicerss.+" checks="PackageExportsNameCheck"/>
+ <!-- Allow the usage of scheduleAtFixedRate in FadingWiFiLEDDriver class -->
+ <suppress files=".+org.openhab.binding.wifiled.handler.FadingWiFiLEDDriver.java" checks="AvoidScheduleAtFixedRateCheck"/>
+ <suppress files=".+[\\/]pom\.xml" checks="OnlyTabIndentationCheck|OnlyTabIndentationInXmlFilesCheck"/>
+ <suppress files=".+org.openhab.binding.yeelight.+" checks="OutsideOfLibExternalLibrariesCheck" />
+ <!-- Suppress header checks for the imported and patched Apache commons-io files in the logreader binding -->
+ <suppress files=".+org.openhab.binding.logreader.internal.thirdparty.commonsio.+" checks="ParameterizedRegexpHeaderCheck|AuthorTagCheck" />
+</suppressions>
--- /dev/null
+## Travis tests have failed
+Hey @{{pullRequestAuthor}},
+please read the following log to understand the reason for the failure. There may also be some helpful tips along the way.
+It would be awesome if you could fix what is wrong and commit the changes.
+
+{{#jobs}}
+### {{displayName}}
+{{#scripts}}
+<details>
+ <summary>
+ <strong>
+ Expand here
+ </strong>
+ </summary>
+
+```
+{{&contents}}
+```
+</details>
+<br />
+{{/scripts}}
+{{/jobs}}
--- /dev/null
+## Travis tests were successful
+Hey @{{pullRequestAuthor}},
+we found no major flaws in your code. Still, you might want to take a look at this log file, as we usually suggest some optional improvements.
+
+{{#jobs}}
+### {{displayName}}
+{{#scripts}}
+<details>
+ <summary>
+ <strong>
+ {{command}}
+ </strong>
+ </summary>
+
+```
+{{&contents}}
+```
+</details>
+<br />
+{{/scripts}}
+{{/jobs}}