- Optimisation interface now matches that of the `Optim.jl` package.
- Added support for heteroscedastic noise in the exact inference case.
- Added autodifferentiation support for kernels with fixed parameters.
- Bug fixes in elliptical slice sampler.
- Introduced an elliptical slice sampler (ESS) as an alternative sampling method to Hamiltonian Monte Carlo
  - Performing inference via an ESS is more robust to poor hyperparameter initialisation
  - An ESS is often able to explore the posterior more efficiently when the Gaussian variables are highly dependent
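For readers unfamiliar with the method, here is a minimal NumPy sketch of the elliptical slice sampling update (after Murray, Adams & MacKay, 2010), which illustrates why it needs no step-size tuning; all names below are illustrative and independent of the package's implementation:

```python
import numpy as np

def elliptical_slice(f, log_lik, chol_sigma, rng):
    """One elliptical slice sampling update for f with prior N(0, Sigma)."""
    nu = chol_sigma @ rng.standard_normal(f.shape)  # auxiliary draw defining the ellipse
    log_y = log_lik(f) + np.log(rng.uniform())      # log slice height
    theta = rng.uniform(0.0, 2.0 * np.pi)           # initial angle on the ellipse
    lo, hi = theta - 2.0 * np.pi, theta             # shrinking bracket
    while True:
        f_new = f * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_new) > log_y:                  # point is on the slice: accept
            return f_new
        if theta < 0.0:                             # otherwise shrink the bracket
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)

# Toy posterior: strongly correlated Gaussian prior, Gaussian likelihood.
rng = np.random.default_rng(0)
Sigma = np.array([[1.0, 0.95], [0.95, 1.0]])
L = np.linalg.cholesky(Sigma)
y = np.array([1.0, -0.5])
log_lik = lambda f: -0.5 * np.sum((y - f) ** 2)
f = np.zeros(2)                                     # note: no step size to tune
for _ in range(500):
    f = elliptical_slice(f, log_lik, L, rng)
```

The bracket-shrinking loop always terminates (at theta = 0 the proposal equals the current state, which lies on the slice), so the sampler has no free parameters to mis-set, which is what makes it robust to poor initialisation.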
- Introduced sparse approximation methods
  - Subset of regressors, deterministic training conditional, fully independent training conditional, and full-scale approximation are all available
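As a sketch of how one of these approximations works, the subset-of-regressors predictive mean can be written in plain NumPy; this is a hypothetical standalone illustration (all names are made up here), not the package's implementation:

```python
import numpy as np

def rbf(A, B, ell=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def sor_predict(X, y, Xu, Xs, noise=0.1):
    """Subset-of-regressors predictive mean at test inputs Xs,
    using m << n inducing inputs Xu."""
    Kuf = rbf(Xu, X)                        # m x n cross-covariance
    Kuu = rbf(Xu, Xu)                       # m x m inducing covariance
    Ksu = rbf(Xs, Xu)                       # test/inducing covariance
    A = Kuu + Kuf @ Kuf.T / noise**2        # solve an m x m system, not n x n
    return Ksu @ np.linalg.solve(A, Kuf @ y) / noise**2

rng = np.random.default_rng(1)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Xu = np.linspace(-3.0, 3.0, 15)[:, None]    # 15 inducing inputs
Xs = np.array([[0.0], [1.5]])
mu = sor_predict(X, y, Xu, Xs)              # close to sin at the test inputs
```

The cost drops from O(n^3) to O(n m^2) because only the m x m system involving the inducing inputs is factorised; the other approximations listed above differ mainly in how they treat the predictive variance.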
- Extended functionality to include leave-one-out cross-validation
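For GP regression, leave-one-out predictions have a closed form (Rasmussen & Williams, Sec. 5.4.2), so the model never needs to be refitted n times; a NumPy sketch of that identity (illustrative, not the package's code):

```python
import numpy as np

def gp_loo(K, y):
    """Closed-form leave-one-out predictive means and variances for GP
    regression; K must already include the noise variance on its diagonal."""
    Kinv = np.linalg.inv(K)
    alpha = Kinv @ y
    d = np.diag(Kinv)
    return y - alpha / d, 1.0 / d           # LOO means, LOO variances

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * d2) + 0.1**2 * np.eye(30)  # SE kernel plus noise
mu, var = gp_loo(K, y)                       # one factorisation, n predictions
```

A useful sanity check is that `mu[i]` exactly matches the predictive mean of a GP refitted with point `i` held out.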
- Introduced functionality to enable variational inference in GPs with non-Gaussian data.
  - The approach used is a variant of that presented in Khan et al.
  - Currently limited to Poisson data; additional likelihood functionality will be added soon.
- Deprecated `GPMC` in favor of `GPA`. This is in line with the fact that approximate inference in a GP is not limited to MCMC; variational methods can now be used.
- Introduced `ElasticGPE` to allow a GP to grow without refitting the whole Gaussian process (see also #88).
- Various performance and interface improvements to kernels
- Introduced `ADKernel` to simplify the introduction of new/custom kernels through the use of auto-differentiation.
- Updated Julia requirement to v1.0
- Updated requirement of `RecipesBase.jl` to v0.6
  - Note: a release compatible with both Julia v0.7 and Julia v1.0 could not be created due to the `RecipesBase.jl` dependency
- Updated Julia requirement to v0.7
- Performance improvements to Kernels
- Added dependencies on `StatsFuns` and `SpecialFunctions`
- Removed dependency on `Compat`
- Renamed `FixedKern` to `FixedKernel`
- Added type parameters to `GPE`, `GPMC`, `ProdKernel`, `SumKernel`, `ProdMean`, and `SumMean`
- Renamed fields of `GPE` and `GPMC` (`x` instead of `X`, `mean` instead of `m`, `kernel` instead of `k`, and `nobs` instead of `nobsv`)
- Renamed fields of `FixedKernel` and `Masked` (`kernel` instead of `kern`)
- Renamed fields of `ProdKernel` and `SumKernel` (`kernels` instead of `kerns`)
- Renamed keyword arguments of the `GPE` constructor to `kernel` and `mean`
- Renamed functions `subkernels` and `submeans` to `components`
- Updated optimization code to be compatible with the new `Optim.jl` API
- Removed `Klara` dependency
- Performance improvements to `predict` functions
- Updated Julia version requirement to 0.6
- `GP` type has been renamed to `GPE` (GP exact) for Gaussian likelihoods
- Introduced `GPMC` type for fitting models with non-Gaussian likelihoods:
  - Bernoulli, Poisson, Binomial, and Student-t likelihoods available
- Introduced priors for parameters of the kernel, mean, and likelihood functions
- MCMC available for `GPE` and `GPMC` types
- Changed plotting functions to use `Plots.jl`
- Created notebooks illustrating package features
- Julia requirement moved up to version 0.5
- Major speed improvements for fitting the GP object, and for covariance and gradient calculations
- New `Masked` kernel
- Various bug fixes
- Introduced `KernelData` type to recycle calculations
- Removed Winston plotting functions and implemented PyPlot as an alternative
- Created methods for `mean` and `cov` functions of the `Mean` and `Kernel` objects
- Fixed `optimize!` function to be consistent with the most recent version of `Optim.jl`
- Improvements to the `Periodic` kernel
- `fit!` function no longer exported due to a clash with a few packages
- Added `fit!` function to fit a new set of observations to an existing GP object
- Julia requirement moved up to v0.4
- Support added for ScikitLearn
- `rand` and `rand!` functions added to sample prior and posterior paths of a Gaussian process
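The idea behind sampling prior paths can be sketched in a few lines of NumPy via a Cholesky factorisation of the covariance; the names below are hypothetical and independent of the package's `rand` API:

```python
import numpy as np

def sample_prior_paths(xs, kernel, n_paths, jitter=1e-6, seed=0):
    """Draw sample paths from a zero-mean GP prior evaluated at points xs."""
    rng = np.random.default_rng(seed)
    K = kernel(xs[:, None], xs[None, :])                  # covariance on the grid
    L = np.linalg.cholesky(K + jitter * np.eye(len(xs)))  # K ~= L L^T
    return L @ rng.standard_normal((len(xs), n_paths))    # columns ~ N(0, K)

se = lambda a, b: np.exp(-0.5 * (a - b) ** 2)             # squared-exponential
xs = np.linspace(0.0, 5.0, 50)
paths = sample_prior_paths(xs, se, n_paths=3)             # 3 prior draws
```

A small jitter on the diagonal keeps the factorisation numerically stable, since a densely evaluated squared-exponential covariance is nearly singular.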
- Major speed improvements for gradient calculations of stationary ARD kernels
- Minor fixes for some kernels
- Fixed plotting deprecation errors with Julia 0.4
- Major speed improvements to kernel calculations, in particular to stationary and composite kernels
- Fixed deprecation warnings for Julia v0.4
- All stationary kernels have the supertype `Stationary`
- Distance matrix calculations outsourced to `Distances`
- Improvements in speed for `predict` and fitting functions
- Positive definite matrix calculations outsourced to `PDMats`