What are the hardware requirements and design considerations for using the Apollo1 in a temp sensor application? Can you provide an example?
In this example, the hardware interface circuit consists of a thermistor, a 10 kΩ reference resistor, and a 0.1 µF capacitor. The circuit performs a measurement by charging the capacitor to approximately VCC, then discharging it through the reference resistor while counting the number of internal clock cycles until the CIN input goes low. The capacitor is then charged to near VCC again and discharged through the thermistor, again counting the internal clock cycles required. The unknown thermistor resistance is determined by taking the ratio of the clock cycles needed to discharge the capacitor through the thermistor to the cycles needed to discharge it through the known reference resistor, then multiplying that ratio by the reference resistor value. Software routines calculate the actual thermistor resistance, map it to the corresponding temperature, and convert the result to degrees Fahrenheit.
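The ratio step described above can be sketched as follows. This is a minimal illustration, not Apollo1 driver code: the function name and the count arguments are hypothetical, and in a real application the counts would come from the timer capturing the CIN discharge intervals.

```c
#include <assert.h>
#include <math.h>

/* Ratiometric thermistor measurement (sketch).
 * Because both discharges use the same capacitor, the same starting
 * voltage, and the same CIN threshold, the RC discharge times are
 * proportional to the resistances:
 *   R_therm / R_ref = N_therm / N_ref
 * so the capacitor tolerance and the exact threshold cancel out. */
static double thermistor_ohms(unsigned long n_therm,  /* cycles via thermistor */
                              unsigned long n_ref,    /* cycles via reference  */
                              double r_ref_ohms)      /* known reference value */
{
    return ((double)n_therm / (double)n_ref) * r_ref_ohms;
}
```

For example, with the 10 kΩ reference, a thermistor discharge taking 8.2 times as many clock cycles as the reference discharge yields `thermistor_ohms(82000, 10000, 10000.0)` = 82 kΩ.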
(Figure: charging waveform on CIN)
① 20 ℃ to 45 ℃ range, 0.1 ℃ resolution
② Thermistor resistance range: [126 kΩ at 20 ℃, 44.69 kΩ at 45 ℃]
③ 0.1 ℃ ≈ 325 Ω, assuming ideal linearity
④ Acceptable resistance error: [0.26 %, 0.73 %]
⑤ 0.325 kΩ / 126 kΩ ≈ 0.26 %; 0.325 kΩ / 44.69 kΩ ≈ 0.73 %
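The resolution and error-budget arithmetic above can be checked with a short sketch; the helper names are illustrative, and the endpoint resistances are the figures quoted in the list:

```c
#include <assert.h>
#include <math.h>

/* Thermistor resistance change corresponding to one 0.1 ℃ step,
 * assuming ideal linearity between the two endpoint resistances. */
static double ohms_per_tenth_degree(double r_at_low_c,   /* e.g. 126 kΩ at 20 ℃  */
                                    double r_at_high_c,  /* e.g. 44.69 kΩ at 45 ℃ */
                                    double span_c)       /* e.g. 25 ℃             */
{
    return (r_at_low_c - r_at_high_c) / span_c * 0.1;
}

/* Relative resistance error that a 0.1 ℃ step represents at a given
 * point on the curve: the measurement must resolve at least this well. */
static double step_error(double step_ohms, double r_ohms)
{
    return step_ohms / r_ohms;
}
```

With the figures above, `ohms_per_tenth_degree(126e3, 44.69e3, 25.0)` gives about 325 Ω, and `step_error` gives about 0.26 % at the 126 kΩ end and about 0.73 % at the 44.69 kΩ end, matching lines ③ to ⑤.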
Test data records show the measurement error at around 0.1 % when data filtering is applied in software.
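The source does not specify which filter brings the error down to roughly 0.1 %; a simple moving average over the raw discharge counts is one common choice, sketched below under that assumption (the window length and function name are illustrative):

```c
#include <stdint.h>

#define FILT_LEN 8u  /* illustrative window length */

/* Moving-average filter over the raw CIN discharge counts.
 * Averaging N samples suppresses uncorrelated count jitter by
 * roughly sqrt(N), at the cost of slower temperature response. */
static uint32_t filt_buf[FILT_LEN];
static unsigned filt_idx;

static uint32_t filter_count(uint32_t raw)
{
    filt_buf[filt_idx++ % FILT_LEN] = raw;

    uint64_t sum = 0;  /* 64-bit so FILT_LEN counts cannot overflow */
    for (unsigned i = 0; i < FILT_LEN; i++)
        sum += filt_buf[i];

    return (uint32_t)(sum / FILT_LEN);
}
```

The filtered count would then be fed into the ratio calculation in place of the raw count.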