Published papers at TMLR

@tmlrpub@sigmoid.social
584 Followers
0 Following
608 Posts

One tweet per submission to Transactions on Machine Learning Research (TMLR).

Bot maintained by @fabian

Web: https://jmlr.org/tmlr/

Mitigating Confirmation Bias in Semi-supervised Learning via Efficient Bayesian Model Averaging

Charlotte Loh, Rumen Dangovski, Shivchander Sudalairaj et al.

Action editor: Frederic Sala.

https://openreview.net/forum?id=PRrKOaDQtQ

#supervised #labeling #classifier

State-of-the-art (SOTA) semi-supervised learning (SSL) methods have been highly successful in leveraging a mix of labeled and unlabeled data, often via self-training or pseudo-labeling. During...
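
Not the paper's Bayesian model-averaging approach, but a minimal numpy sketch of the confidence-thresholded pseudo-labeling step that standard self-training SSL pipelines rely on (threshold value and toy data are illustrative):

```python
import numpy as np

def pseudo_label(probs, threshold=0.95):
    """Select unlabeled examples whose top class probability exceeds a
    confidence threshold and return their hard pseudo-labels.

    probs: (n_unlabeled, n_classes) array of softmax outputs.
    Returns (indices, labels) of the confidently predicted subset.
    """
    confidence = probs.max(axis=1)
    labels = probs.argmax(axis=1)
    keep = confidence >= threshold
    return np.where(keep)[0], labels[keep]

# Toy usage: 4 unlabeled examples, 3 classes.
probs = np.array([[0.97, 0.02, 0.01],
                  [0.40, 0.35, 0.25],
                  [0.05, 0.05, 0.90],
                  [0.50, 0.49, 0.01]])
idx, y_hat = pseudo_label(probs, threshold=0.9)
print(idx, y_hat)   # -> [0 2] [0 2]
```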

HERMES: Hybrid Error-corrector Model with inclusion of External Signals for nonstationary fashion...

Etienne David, Jean Bellot, Sylvain Le Corff

Action editor: Makoto Yamada.

https://openreview.net/forum?id=4ofFo7D5GL

#forecasting #trend #fashion

Developing models and algorithms to predict nonstationary time series is a long-standing statistical problem. It is crucial for many applications, in particular for fashion or retail industries, to...
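
As a rough illustration of the hybrid idea only (a per-series statistical baseline plus a shared corrector driven by external signals), here is a hypothetical numpy sketch with a last-value baseline and a linear residual corrector; the paper's actual architecture and signals differ:

```python
import numpy as np

def naive_forecast(series):
    """Per-series statistical baseline: repeat the last observed value."""
    return series[-1]

def fit_corrector(series_batch, external):
    """Fit a shared linear corrector that predicts the baseline's residual
    from an external signal. Hypothetical stand-in for a learned corrector."""
    residuals, feats = [], []
    for series, ext in zip(series_batch, external):
        residuals.append(series[-1] - naive_forecast(series[:-1]))
        feats.append([ext, 1.0])          # external signal + bias term
    coef, *_ = np.linalg.lstsq(np.array(feats), np.array(residuals), rcond=None)
    return coef

def hybrid_forecast(series, ext, coef):
    """Baseline prediction plus the learned residual correction."""
    return naive_forecast(series) + coef @ np.array([ext, 1.0])

batch = [np.array([1.0, 1.2, 1.5]), np.array([2.0, 2.1, 2.0])]
signals = [0.3, -0.1]                      # e.g. an external trend index
coef = fit_corrector(batch, signals)
print(hybrid_forecast(batch[0], 0.3, coef))
```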

Detecting incidental correlation in multimodal learning via latent variable modeling

Taro Makino, Yixin Wang, Krzysztof J. Geras, Kyunghyun Cho

Action editor: Thang Bui.

https://openreview.net/forum?id=QoRo9QmOAr

#multimodal #modality #variational

Multimodal neural networks often fail to utilize all modalities. They subsequently generalize worse than their unimodal counterparts, or make predictions that only depend on a subset of modalities....
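
A crude way to see whether a fused model actually uses each modality is to perturb one modality and measure how much the predictions move. This diagnostic is only illustrative and is not the latent-variable test proposed in the paper:

```python
import numpy as np

def modality_reliance(predict, x_a, x_b):
    """Compare predictions of a fused model against predictions with one
    modality replaced by its mean, to gauge whether the model uses it."""
    full = predict(x_a, x_b)
    drop_a = predict(np.full_like(x_a, x_a.mean()), x_b)
    drop_b = predict(x_a, np.full_like(x_b, x_b.mean()))
    return {"shift_without_a": float(np.abs(full - drop_a).mean()),
            "shift_without_b": float(np.abs(full - drop_b).mean())}

# Toy fused predictor that ignores modality b entirely.
predict = lambda a, b: a.sum(axis=1)
x_a, x_b = np.random.randn(8, 3), np.random.randn(8, 5)
print(modality_reliance(predict, x_a, x_b))  # shift_without_b is ~0
```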

Fast Kernel Methods for Generic Lipschitz Losses via $p$-Sparsified Sketches

Tamim El Ahmad, Pierre Laforgue, Florence d'Alché-Buc

Action editor: Makoto Yamada.

https://openreview.net/forum?id=ry2qgRqTOw

#sparse #kernel #approximations

Kernel methods are learning algorithms that enjoy solid theoretical foundations while suffering from important computational limitations. Sketching, which consists in looking for solutions among a...
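
For context, a plain sub-sampling (Nyström-style) sketch of kernel ridge regression shows what "looking for solutions in a reduced subspace" means in code; the paper's p-sparsified sketches are a different, more refined construction:

```python
import numpy as np

def rbf(X, Z, gamma=1.0):
    """Gaussian kernel matrix between rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def sketched_krr(X, y, m=20, lam=1e-2, gamma=1.0, seed=None):
    """Kernel ridge regression restricted to m randomly sub-sampled
    landmarks: solve in the reduced basis instead of the full one."""
    rng = np.random.default_rng(seed)
    Z = X[rng.choice(len(X), size=m, replace=False)]
    Knm, Kmm = rbf(X, Z, gamma), rbf(Z, Z, gamma)
    alpha = np.linalg.solve(Knm.T @ Knm + lam * Kmm, Knm.T @ y)
    return lambda Xnew: rbf(Xnew, Z, gamma) @ alpha

X = np.random.randn(200, 2)
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(200)
f = sketched_krr(X, y, m=30, seed=0)
print(f(X[:5]))
```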

Variational Elliptical Processes

Maria Margareta Bånkestad, Jens Sjölund, Jalil Taghia, Thomas B. Schön

Action editor: Sinead Williamson.

https://openreview.net/forum?id=djN3TaqbdA

#gaussian #variational #likelihood

We present elliptical processes—a family of non-parametric probabilistic models that subsumes Gaussian processes and Student's t processes. This generalization includes a range of new heavy-tailed...
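
One member of the elliptical family, the Student-t process, can be sampled on a grid by rescaling Gaussian-process draws with a random mixing variable. A small numpy sketch of that scale-mixture view (the paper's variational treatment is far more general):

```python
import numpy as np

def sample_student_t_process(x, nu=3.0, lengthscale=0.5, n_samples=5, seed=None):
    """Draw paths from a Student-t process by scaling Gaussian-process paths
    with a sqrt(nu / chi^2_nu) factor, giving heavier tails than a GP."""
    rng = np.random.default_rng(seed)
    d2 = (x[:, None] - x[None, :]) ** 2
    K = np.exp(-0.5 * d2 / lengthscale ** 2) + 1e-8 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    gp_paths = L @ rng.standard_normal((len(x), n_samples))
    scale = np.sqrt(nu / rng.chisquare(nu, size=n_samples))
    return gp_paths * scale

paths = sample_student_t_process(np.linspace(0, 1, 50), seed=0)
print(paths.shape)  # (50, 5)
```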

Single-Pass Contrastive Learning Can Work for Both Homophilic and Heterophilic Graph

Haonan Wang, Jieyu Zhang, Qi Zhu, Wei Huang, Kenji Kawaguchi, Xiaokui Xiao

Action editor: Sinead Williamson.

https://openreview.net/forum?id=244KePn09i

#graphs #graph #nodes

Existing graph contrastive learning (GCL) techniques typically require two forward passes for a single instance to construct the contrastive loss, which is effective for capturing the low-frequency...
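
For reference, the usual two-pass setup the paper avoids: two augmented views are encoded separately and compared with an InfoNCE loss. A minimal numpy sketch, with random embeddings standing in for the two forward passes:

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """Two-view InfoNCE: rows of z1 and z2 are node embeddings from two
    forward passes over two augmented graphs; positives sit on the diagonal."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # (n_nodes, n_nodes)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

z_view1 = np.random.randn(16, 8)
z_view2 = z_view1 + 0.1 * np.random.randn(16, 8)  # stand-in for the 2nd pass
print(info_nce(z_view1, z_view2))
```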

Individual Privacy Accounting for Differentially Private Stochastic Gradient Descent

Da Yu, Gautam Kamath, Janardhan Kulkarni, Tie-Yan Liu, Jian Yin, Huishuai Zhang

Action editor: Naman Agarwal.

https://openreview.net/forum?id=l4Jcxs0fpC

#privacy #private #sgd

Differentially private stochastic gradient descent (DP-SGD) is the workhorse algorithm for recent advances in private deep learning. It provides a single privacy guarantee to all datapoints in the...
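
The core DP-SGD mechanism in a few lines of numpy: per-example gradient clipping plus Gaussian noise, here for logistic regression. This is the standard recipe, not the paper's individual accounting scheme, and the noise scale is illustrative rather than calibrated:

```python
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip=1.0, noise_mult=1.0, seed=None):
    """One DP-SGD step: clip each per-example gradient to norm `clip`,
    sum, add Gaussian noise, average, then take a gradient step."""
    rng = np.random.default_rng(seed)
    p = 1.0 / (1.0 + np.exp(-X @ w))
    per_example_grads = (p - y)[:, None] * X           # (n, d)
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads / np.maximum(1.0, norms / clip)
    noise = rng.normal(0.0, noise_mult * clip, size=w.shape)
    grad = (clipped.sum(axis=0) + noise) / len(X)
    return w - lr * grad

X = np.random.randn(64, 5)
y = (np.random.rand(64) < 0.5).astype(float)
print(dp_sgd_step(np.zeros(5), X, y, seed=0))
```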

A Survey on Causal Discovery Methods for I.I.D. and Time Series Data

Uzma Hasan, Emam Hossain, Md Osman Gani

Action editor: Patrick Flaherty.

https://openreview.net/forum?id=YdMrdhGx9y

#causal #causality #discovery

The ability to understand causality from data is one of the major milestones of human-level intelligence. Causal Discovery (CD) algorithms can identify the cause-effect relationships among the...
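
A taste of the constraint-based family covered by the survey: conditional-independence tests, e.g. via partial correlation, which can separate a common cause from a direct link. A toy numpy example:

```python
import numpy as np

def partial_corr(x, y, z):
    """Partial correlation of x and y given z: the conditional-independence
    primitive behind constraint-based discovery methods such as PC."""
    zx = np.column_stack([z, np.ones_like(z)])
    rx = x - zx @ np.linalg.lstsq(zx, x, rcond=None)[0]
    ry = y - zx @ np.linalg.lstsq(zx, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(0)
z = rng.standard_normal(2000)
x = z + 0.1 * rng.standard_normal(2000)   # z -> x
y = z + 0.1 * rng.standard_normal(2000)   # z -> y
print(np.corrcoef(x, y)[0, 1])            # strongly correlated
print(partial_corr(x, y, z))              # ~0 once z is conditioned on
```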

A DNN Optimizer that Improves over AdaBelief by Suppression of the Adaptive Stepsize Range

Guoqiang Zhang, Kenta Niwa, W. Bastiaan Kleijn

Action editor: Rémi Flamary.

https://openreview.net/forum?id=VI2JjIfU37

#optimizers #imagenet #optimizer

We make contributions towards improving adaptive-optimizer performance. Our improvements are based on suppression of the range of adaptive stepsizes in the AdaBelief optimizer. Firstly, we show...
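
A hypothetical sketch of the general idea: an AdaBelief-style update whose per-coordinate adaptive stepsize is clamped into a fixed range. The clamp is only a stand-in for the paper's suppression mechanism, which is more elaborate:

```python
import numpy as np

def adabelief_step(w, grad, m, s, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8,
                   step_range=(0.1, 10.0)):
    """One AdaBelief-style step with the adaptive stepsize clamped."""
    m = b1 * m + (1 - b1) * grad
    s = b2 * s + (1 - b2) * (grad - m) ** 2           # "belief" in the gradient
    m_hat = m / (1 - b1 ** t)
    s_hat = s / (1 - b2 ** t)
    step = 1.0 / (np.sqrt(s_hat) + eps)               # adaptive stepsize
    step = np.clip(step, *step_range)                 # suppress its range
    return w - lr * step * m_hat, m, s

# Toy usage: minimize the quadratic f(w) = ||w||^2, whose gradient is 2w.
w, m, s = np.array([5.0, -3.0]), np.zeros(2), np.zeros(2)
for t in range(1, 2001):
    w, m, s = adabelief_step(w, 2 * w, m, s, t, lr=1e-2)
print(w)   # approaches the minimum at [0, 0]
```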

Faster Training of Neural ODEs Using Gauß–Legendre Quadrature

Alexander Luke Ian Norcliffe, Marc Peter Deisenroth

Action editor: Kevin Swersky.

https://openreview.net/forum?id=f0FSDAy1bU

#odes #models #quadrature

Neural ODEs demonstrate strong performance in generative and time-series modelling. However, training them via the adjoint method is slow compared to discrete models due to the requirement of...
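
The quadrature rule itself is easy to state: Gauss-Legendre nodes and weights approximate the kind of integral the adjoint method needs for parameter gradients. A small numpy example on a generic stand-in integrand:

```python
import numpy as np

def gauss_legendre_integral(func, a, b, n_nodes=8):
    """Approximate the integral of func over [a, b] with Gauss-Legendre
    quadrature: evaluate at the mapped nodes and take a weighted sum."""
    nodes, weights = np.polynomial.legendre.leggauss(n_nodes)
    t = 0.5 * (b - a) * nodes + 0.5 * (b + a)    # map nodes from [-1, 1]
    return 0.5 * (b - a) * np.sum(weights * func(t))

# Exact value of the integral of exp(t) over [0, 1] is e - 1 ~= 1.71828;
# eight nodes already match it to many digits.
print(gauss_legendre_integral(np.exp, 0.0, 1.0))
```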
