Homepage of Kazuho Watanabe

[Japanese]

Dept. of Computer Science and Engineering, Toyohashi University of Technology
e-mail: wkazuho [at] cs.tut.ac.jp

CV, Publications

--News--------------------------------------
  • IMSI Workshop (Univ. of Chicago, USA, Dec. 11-12)
    Bayesian Statistics and Statistical Learning: New Directions in Algebraic Statistics
    "Rate-distortion theoretical views of Bayesian learning coefficients"
  • IEEE ISIT2021 (Virtual Conf., Jul. 12-20)
    "Statistical Learning of the Insensitive Parameter in Support Vector Models"
  • IEEE ITW2020 (Virtual Conf., Apr. 11-15, 2021)
    Masahiro Kobayashi, Kazuho Watanabe,
    Unbiased Estimation Equation under f-Separable Bregman Distortion Measures
  • The following paper is to appear in Neurocomputing.
    Masahiro Kobayashi, Kazuho Watanabe,
    Generalized Dirichlet-process-means for f-separable distortion measures <arXiv>
    Neurocomputing, Special Issue on Advanced Methods in Optimization and Machine Learning for Heterogeneous Data Analytics
  • IEEE IJCNN2020 (Virtual Conf., Jul. 19-24)
    Daisuke Kaji, Kazuho Watanabe, Masahiro Kobayashi
    "Multi-Decoder RNN Autoencoder Based on Variational Bayes Method"
  • IEEE ISIT2020 (Virtual Conf., Jun. 21-26)
    "Discrete Optimal Reconstruction Distributions for Itakura-Saito Distortion Measure"
  • 2020 ITA Workshop (San Diego, USA, Feb. 3)
    "Rate-distortion theoretic interpretation of Bayesian learning coefficients"
  • ACML2019 (Nagoya, Japan, Nov. 17-19)
    Kenta Konagayoshi, Kazuho Watanabe
    "Minimax Online Prediction of Varying Bernoulli Process under Variational Approximation" <pdf, supplementary>
  • The following book has been published by Cambridge Univ. Press:
    "Variational Bayesian Learning Theory" (co-authored with Shinichi Nakajima and Masashi Sugiyama)
  • WITMSE2017 (Paris, France, Sep. 11-13)
    "Rate-distortion dimension and Bayesian learning coefficient"
  • The following two papers have been published in Entropy.
    Projection to Mixture Families and Rate-Distortion Bounds with Power Distortion Measures
    Special Issue "Information Geometry II", 19(6), 262, 2017.
    Rate-Distortion Bounds for Kernel-Based Distortion Measures
    Special Issue "Information Theory in Machine Learning and Data Science", 19(7), 336, 2017.
  • IEEE ISIT2017 (Aachen, Germany, Jun. 25-30)
    "Rate-Distortion Tradeoffs under Kernel-Based Distortion Measures"

  • WITMSE2016 (Univ. of Helsinki, Finland, Sep. 19-21)
  • IEEE ITW2016 (Univ. of Cambridge, UK, Sep. 11-14)
    "Constant-Width Rate-Distortion Bounds for Power Distortion Measures"
    I am also a big fan of Prof. MacKay's book...
  • The following paper has been accepted to IEEE Trans. on Information Theory.
    Kazuho Watanabe and Shiro Ikeda,
    "Rate-Distortion Functions for Gamma-Type Sources under Absolute-Log Distortion Measure"
  • I gave a presentation at HD3-2015 (Kyoto, Japan, Dec. 14-17).
  • I attended WITMSE2015 (Copenhagen, Denmark, June 24-26).
  • The following paper has been accepted to Neurocomputing:
    Kazuho Watanabe, "Vector Quantization Based on Epsilon-Insensitive Mixture Models"
  • The following paper has been accepted to IEEE PacificVis 2015:
    Kazuho Watanabe, Hsiang-Yun Wu, Yusuke Niibe, Shigeo Takahashi, and Issei Fujishiro,
    "Biclustering Multivariate Data for Correlated Subspace Mining"
  • The following paper has been accepted to the Journal of Machine Learning Research:
    Kazuho Watanabe and Teemu Roos,
    "Achievability of Asymptotic Minimax Regret by Horizon-Dependent and Horizon-Independent Strategies"
  • NIPS2014: Shinichi Nakajima, Issei Sato, Masashi Sugiyama, Kazuho Watanabe, and Hiroko Kobayashi,
    "Analysis of variational Bayesian latent Dirichlet allocation: weaker sparsity than MAP"
  • The following paper has been accepted to IEEE Trans. on Neural Networks and Learning Systems:
    Takuya Konishi, Takatomi Kubo, Kazuho Watanabe, and Kazushi Ikeda,
    "Variational Bayesian inference algorithms for infinite relational model of network data"
  • The following paper has been accepted to Machine Learning journal:
    Kazuho Watanabe and Shiro Ikeda,
    "Entropic risk minimization for nonparametric estimation of mixing distributions"
  • iV2014: Koto Nohno, Hsiang-Yun Wu, Kazuho Watanabe, Shigeo Takahashi, and Issei Fujishiro,
    "Spectral-based contractible parallel coordinates"
  • WITMSE2014: Kazuho Watanabe,
    "Rate-Distortion Analysis for an Epsilon-Insensitive Loss Function"
  • ISIT2014: Andrew Barron, Teemu Roos, and Kazuho Watanabe,
    "Bayesian Properties of Normalized Maximum Likelihood and its Fast Computation"
  • The following paper has been accepted to IEEE Trans. on Neural Networks and Learning Systems:
    Atsushi Miyamoto, Kazuho Watanabe, Kazushi Ikeda, and Masa-aki Sato,
    "Variational inference with ARD prior for NIRS diffuse optical tomography"
  --------------------------------------------

    My former pages, etc.:
    Back to TUT-LISL

    Otlichnyi tekhnicheskii universitet est' tut, v Toyohashi. (Russian: "There is an excellent technical university here (tut), in Toyohashi.")