Write a program that calculates the amount a person would earn over a period of time if his or her salary is one penny the first day, two pennies the second day, and continues to double each day. The program should display a table showing the salary for each day, and then show the total pay at the end of the period. The output should be displayed as a dollar amount, not as a number of pennies.
Input Validation: Do not accept a number less than 1 for the number of days worked.
Breaking it down
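One way to break the problem down: read and validate the number of days, double the pay as you walk through the days, keep a running total, and convert pennies to dollars only when printing. Below is a minimal sketch, assuming Java (suggested by the Level Up's description of `double`); the class name, the prompts, and the `totalPennies` helper are illustrative, not part of the assignment. Working in whole pennies and dividing by 100 only for display keeps the arithmetic exact.

```java
import java.util.Scanner;

public class PenniesForPay {

    // Total pay in pennies after the given number of days:
    // 1 + 2 + 4 + ... + 2^(days-1) = 2^days - 1.
    public static long totalPennies(int days) {
        if (days < 1) {
            throw new IllegalArgumentException("days must be at least 1");
        }
        long total = 0;
        long pay = 1; // day 1 pays one penny
        for (int day = 1; day <= days; day++) {
            total += pay;
            pay *= 2; // pay doubles each day
        }
        return total;
    }

    public static void main(String[] args) {
        Scanner keyboard = new Scanner(System.in);

        // Input validation: keep asking until the user enters at least 1 day.
        System.out.print("How many days will you work? ");
        int days = keyboard.nextInt();
        while (days < 1) {
            System.out.print("Days must be 1 or more. Try again: ");
            days = keyboard.nextInt();
        }

        // Print the daily table, converting pennies to dollars for display only.
        System.out.printf("%-7s %s%n", "Day", "Salary");
        long pay = 1;
        for (int day = 1; day <= days; day++) {
            System.out.printf("%-7d $%,.2f%n", day, pay / 100.0);
            pay *= 2;
        }

        System.out.printf("Total pay: $%,.2f%n", totalPennies(days) / 100.0);
    }
}
```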
Unit tests
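If the calculation is factored into a pure method like the hypothetical `totalPennies` above, it can be tested without touching the console I/O. Here is a sketch using JUnit 5; the method name and the exception on invalid input are assumptions carried over from the sketch above. A useful oracle is the closed form 1 + 2 + ... + 2^(n-1) = 2^n - 1 pennies.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

class PenniesForPayTest {

    @Test
    void oneDayPaysOnePenny() {
        assertEquals(1L, PenniesForPay.totalPennies(1));
    }

    @Test
    void totalMatchesClosedForm() {
        // 2^5 - 1 and 2^30 - 1 pennies, respectively.
        assertEquals(31L, PenniesForPay.totalPennies(5));             // $0.31
        assertEquals(1_073_741_823L, PenniesForPay.totalPennies(30)); // $10,737,418.23
    }

    @Test
    void rejectsFewerThanOneDay() {
        assertThrows(IllegalArgumentException.class, () -> PenniesForPay.totalPennies(0));
    }
}
```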
Output
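An illustrative run for 5 days (the exact prompt wording and column spacing depend on your program, but the dollar amounts follow directly from the doubling rule):

```
How many days will you work? 5
Day     Salary
1       $0.01
2       $0.02
3       $0.04
4       $0.08
5       $0.16
Total pay: $0.31
```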
Level Up
What happens if the number of days produces a total larger than the data type can hold? The double data type is a double-precision 64-bit IEEE 754 floating-point value, which can represent whole numbers exactly only up to 2^53; what other data type could you use?
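A long keeps exact penny totals only through day 63 (the day-64 total, 2^64 - 1 pennies, overflows a signed 64-bit integer), and a double starts losing whole-penny precision past 2^53 pennies. Assuming Java, `java.math.BigInteger` handles the penny arithmetic at any size, with `BigDecimal` for the dollar conversion. A minimal sketch (the class name is illustrative) using the closed form 2^days - 1:

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class PenniesForPayBig {

    // Total pay in pennies: 2^days - 1. BigInteger never overflows,
    // unlike long (fails past day 63) or double (inexact past 2^53).
    public static BigInteger totalPennies(int days) {
        if (days < 1) {
            throw new IllegalArgumentException("days must be at least 1");
        }
        return BigInteger.valueOf(2).pow(days).subtract(BigInteger.ONE);
    }

    public static void main(String[] args) {
        int days = 365; // far beyond what long or double can hold exactly
        BigDecimal dollars = new BigDecimal(totalPennies(days))
                .movePointLeft(2); // shift two places: pennies -> dollars
        System.out.printf("Total pay after %d days: $%,.2f%n", days, dollars);
    }
}
```

BigDecimal alone would also work if you prefer to accumulate in dollars, as long as values are constructed from strings or integers rather than from doubles.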