Reduced chi-squared as a goodness-of-fit measure

#1
In Origin Pro 9, I made two linear fits (y = kx + b). For example, I chose the equation y = x + 10.
In the first case, I set the following experimental points:
x = 1, y = 11
x = 2, y = 12
x = 3, y = 12.8
x = 4, y = 14.1
x = 5, y = 15
x = 6, y = 16
x = 7, y = 17.3.
In the second case, I set:
x = 1, y = 10
x = 2, y = 12
x = 3, y = 13.2
x = 4, y = 13.7
x = 5, y = 15
x = 6, y = 15.5
x = 7, y = 18.
In the first case, the fit is near-ideal; in the second case, it is relatively bad.
I did not set any errors.
I used Origin's default linear fit (Analysis -> Fitting -> Linear Fit) and ticked 'Reduced Chi-Sqr' in 'Quantities to Compute', since I have read that reduced chi-squared is a measure of goodness of fit. In the first case (points lying well on the fit line), the reduced chi-squared is approximately 0.02; in the second case (points more scattered), it is approximately 0.3. I have also read that if the data are well described by the model function, the reduced chi-squared should be equal to 1.

Thus I have two questions:

1) How does Origin calculate the reduced chi-squared, given that I didn't provide any errors? As far as I know, the formula for chi-squared (and thus for reduced chi-squared) includes the errors of the experimental data:

chi^2 = sum_{n=1}^{N} [(y_n - f(x_n, p)) / sigma_n]^2,

where N is the number of experimental points, y_n are the experimental data, f(x_n, p) is the model function, x_n are the positions (or times), p are the fitting parameters, and sigma_n are the errors. To obtain the reduced chi-squared, one divides the chi-squared by the number of degrees of freedom, (N - k), where k is the number of fitting parameters.

2) If we suppose that Origin somehow calculates the true reduced chi-squared, why are the obtained values so far from 1?
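For what it's worth, the reported values can be reproduced under one assumption (my guess, not something I found in Origin's documentation): when no error column is supplied, every sigma_n is taken as 1, so the statistic collapses to RSS / (N - k), i.e. the residual sum of squares over the degrees of freedom. The sketch below fits both data sets by ordinary least squares and computes that quantity:

```python
# Sketch: reproduce Origin's 'Reduced Chi-Sqr' under the ASSUMPTION that,
# with no error column supplied, Origin sets every sigma_n = 1, so the
# statistic reduces to RSS / (N - k).

def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = k*x + b; returns (k, b)."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    k = sxy / sxx
    b = ybar - k * xbar
    return k, b

def reduced_chi_sq(xs, ys, k, b, n_params=2):
    """Chi-squared with all sigma_n = 1, divided by dof = N - n_params."""
    rss = sum((y - (k * x + b)) ** 2 for x, y in zip(xs, ys))
    return rss / (len(xs) - n_params)

xs = [1, 2, 3, 4, 5, 6, 7]
y1 = [11, 12, 12.8, 14.1, 15, 16, 17.3]   # first data set
y2 = [10, 12, 13.2, 13.7, 15, 15.5, 18]   # second data set

for ys in (y1, y2):
    k, b = linear_fit(xs, ys)
    print(round(reduced_chi_sq(xs, ys, k, b), 3))
# prints 0.018 and 0.301 -- matching the ~0.02 and ~0.3 reported above
```

If this assumption is right, the values are just the mean squared residuals, which explains why they are nowhere near 1: without real sigma_n, the statistic carries the units of y^2 and has no reason to be of order unity.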