An object representing a histogram of data values of the same type of unit.
- iOS 13.0+ Beta
- Mac Catalyst 13.0+ Beta
A histogram measures how many times the value of a variable falls within specific ranges across a set of data. Histograms are usually depicted as bar charts, in which each bar represents a range of values and the height of each bar represents the number of data points whose values fall within that range. In this class, each bar is represented by a bucket.
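The bucket concept above can be sketched in plain Swift. This is an illustrative model only, not this class's API: the `Bucket` type and `makeHistogram` function are hypothetical names used to show how each bucket pairs a value range with a count.

```swift
// Illustrative sketch: each bucket covers a half-open range of values
// and counts how many data points fall inside it.
struct Bucket {
    let start: Double   // inclusive lower bound of the range
    let end: Double     // exclusive upper bound of the range
    var count: Int      // number of data points in this range
}

// Builds a histogram of `bucketCount` equal-width buckets starting at `lowerBound`.
func makeHistogram(values: [Double],
                   bucketWidth: Double,
                   lowerBound: Double,
                   bucketCount: Int) -> [Bucket] {
    var buckets = (0..<bucketCount).map { i in
        Bucket(start: lowerBound + Double(i) * bucketWidth,
               end: lowerBound + Double(i + 1) * bucketWidth,
               count: 0)
    }
    for value in values {
        // Find the bucket index for this value; values outside all
        // bucket ranges are ignored.
        let index = Int((value - lowerBound) / bucketWidth)
        if index >= 0 && index < bucketCount {
            buckets[index].count += 1
        }
    }
    return buckets
}

// Example: hypothetical timing samples, in milliseconds.
let samples = [120.0, 180.0, 95.0, 310.0, 150.0]
let histogram = makeHistogram(values: samples,
                              bucketWidth: 100,
                              lowerBound: 0,
                              bucketCount: 4)
for bucket in histogram {
    print("[\(bucket.start), \(bucket.end)): \(bucket.count)")
}
```

Here the bar for the range [100, 200) would be the tallest, because three of the five samples fall within it.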