Refereed Journal Papers

  1. T. Nakagawa, Y. Sanada, H. Waida, Y. Zhang, Y. Wada, K. Takanashi, T. Yamada, T. Kanamori,
    Denoising Cosine Similarity: A Theory-Driven Approach for Efficient Representation Learning.
    Neural Networks, Volume 169, January 2024, Pages 226-241.
  2. L. Andéol, Y. Kawakami, Y. Wada, T. Kanamori, K. R. Müller, G. Montavon,
    Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization.
    Neural Networks, Volume 167, October 2023, Pages 233-243.
  3. Y. Zhang, Y. Wada, H. Waida, K. Goto, Y. Hino, T. Kanamori,
    Deep Clustering with a Constraint for Topological Invariance based on Symmetric InfoNCE.
    Neural Computation, July 2023.
  4. S. Liu, T. Kanamori, and D. J. Williams,
    Estimating Density Models with Truncation Boundaries using Score Matching.
    Journal of Machine Learning Research, June 2022.
  5. Y. Mae, W. Kumagai, T. Kanamori,
    Uncertainty Propagation for Dropout-based Bayesian Neural Networks.
    Neural Networks, December 2021.
  6. Y. Wada, S. Miyamoto, T. Nakagawa, L. Andéol, W. Kumagai, T. Kanamori,
    Spectral Embedded Deep Clustering.
    Entropy, vol. 21, no. 8, 795, August 2019.
  7. Y. Wada, S. Su, W. Kumagai, T. Kanamori,
    Robust Label Prediction via Label Propagation and Geodesic k-Nearest Neighbor in Online Semi-Supervised Learning.
    IEICE Transactions on Information and Systems, Vol.E102-D, No.8, August 2019.
  8. K. Matsui, W. Kumagai, K. Kanamori, M. Nishikimi, T. Kanamori,
    Variable Selection for Nonparametric Learning with Power Series Kernels.
    Neural Computation, 31(8):1718-1750, August 2019.
  9. W. Kumagai, T. Kanamori,
    Risk Bound of Transfer Learning using Parametric Feature Mapping and Its Application to Sparse Coding.
    Machine Learning, vol. 108, pp. 1975--2008, May 2019.
  10. T. Kanamori, N. Osugi,
    Model Description of Similarity-based Recommendation Systems.
    Entropy, vol. 21, no. 7, 702, July 2019.
  11. K. Sudo, N. Osugi, T. Kanamori,
    Numerical Study of Reciprocal Recommendation with Domain Matching.
    Japanese Journal of Statistics and Data Science, Volume 2, Issue 1, pp 221--240, June 2019.
  12. H. Sasaki, T. Kanamori, A. Hyvärinen, and M. Sugiyama,
    Mode-Seeking Clustering and Density Ridge Estimation via Direct Estimation of Density-Derivative-Ratios.
    Journal of Machine Learning Research, Volume 18, Pages 1--47, April 2018.
  13. T. Kanamori, T. Takenouchi,
    Graph-based Composite Local Bregman Divergences on Discrete Sample Spaces.
    Neural Networks, Volume 95, Pages 44--56, November 2017.
  14. T. Kanamori, S. Fujiwara, A. Takeda,
    Robustness of Learning Algorithms using Hinge Loss with Outlier Indicators.
    Neural Networks, Volume 94, Pages 173--191, October 2017.
  15. K. Matsui, W. Kumagai, T. Kanamori,
    Parallel Distributed Block Coordinate Descent Methods based on Pairwise Comparison Oracle.
    Journal of Global Optimization, Volume 69, Issue 1, pp 1--21, September 2017.
  16. T. Takenouchi, T. Kanamori,
    Statistical Inference with Unnormalized Discrete Models and Localized Homogeneous Divergences.
    Journal of Machine Learning Research, vol. 18, num. 56, pages 1--26, July 2017.
  17. S. Fujiwara, A. Takeda, T. Kanamori,
    DC Algorithm for Extended Robust Support Vector Machine.
    Neural Computation, vol. 29, num. 5, pages 1406--1438, May 2017.
  18. T. Kanamori, S. Fujiwara, A. Takeda,
    Breakdown Point of Robust Support Vector Machines.
    Entropy, vol. 19, no. 2, 83, February 2017.
  19. T. Kanamori,
    Efficiency Bound of Local Z-Estimators on Discrete Sample Spaces.
    Entropy, vol. 18, no. 7, 273, July 2016.
  20. T. Kanamori, H. Fujisawa,
    Robust Estimation under Heavy Contamination using Unnormalized Models.
    Biometrika, vol. 102, no. 3, pp. 559-572, Sep. 2015.
  21. A. Takeda, S. Fujiwara, T. Kanamori,
    Extended Robust Support Vector Machine Based on Financial Risk Minimization.
    Neural Computation, vol. 26, num. 11, pp. 2541-2569, Nov. 2014.
  22. T. Kanamori and H. Fujisawa,
    Affine Invariant Divergences associated with Proper Composite Scoring Rules and their Applications.
    Bernoulli, vol. 20, No. 4, pp. 2278-2304, Nov. 2014.
  23. T. Kanamori and A. Takeda,
    A Numerical Study of Learning Algorithms on Stiefel Manifold.
    Computational Management Science, vol. 11, Issue 4, pp 319-340, Oct. 2014.
  24. A. Takeda, T. Kanamori,
    Using Financial Risk for Analyzing Generalization Performance of Machine Learning Models.
    Neural Networks, vol. 57, pp. 29-38, Sep. 2014.
  25. T. D. Nguyen, M. C. du Plessis, T. Kanamori, M. Sugiyama,
    Constrained Least-Squares Density-Difference Estimation.
    IEICE Transactions on Information and Systems, vol. E97-D, no. 7, pp. 1822-1829, July, 2014.
  26. T. Kanamori,
    Scale-Invariant Divergences for Density Functions.
    Entropy, vol 16(5), pp. 2611-2628, May 2014.
  27. T. Kanamori and M. Sugiyama,
    Statistical Analysis of Distance Estimators with Density Differences and Density Ratios.
    Entropy, vol. 16 (2), pp. 921-942, Feb. 2014.
  28. T. Kanamori, A. Ohara,
    A Bregman extension of quasi-Newton updates II: analysis of robustness properties.
    Journal of Computational and Applied Mathematics, vol. 253, pp. 104-122, Dec. 2013.
  29. M. Sugiyama, T. Kanamori, T. Suzuki, M. C. du Plessis, S. Liu, I. Takeuchi,
    Density Difference Estimation.
    Neural Computation, vol. 25(10), pp. 2734-2775, Oct. 2013.
  30. T. Kanamori, A. Takeda, T. Suzuki,
    Conjugate Relation between Loss Functions and Uncertainty Sets in Classification Problems.
    Journal of Machine Learning Research, vol. 14, pp. 1461-1504, June, 2013.
  31. M. Sugiyama, S. Liu, M. C. du Plessis, Y. Yamanaka, M. Yamada, T. Suzuki, T. Kanamori,
    Direct Divergence Approximation between Probability Distributions and Its Applications in Machine Learning.
    Journal of Computing Science and Engineering, vol. 7, no. 2, pp.99-111, June, 2013.
  32. M. Yamada, T. Suzuki, T. Kanamori, H. Hachiya, M. Sugiyama,
    Relative Density-Ratio Estimation for Robust Distribution Comparison.
    Neural Computation, vol. 25, No. 5, pp. 1324-1370, May 2013.
  33. M. Kawakita, T. Kanamori,
    Semi-Supervised Learning with Density-Ratio Estimation. [arXiv]
    Machine Learning, Volume 91, Issue 2, pp 189-209, May 2013.
  34. T. Kanamori,
    Statistical Models and Learning Algorithms for Ordinal Regression Problems.
    Information Fusion, vol. 14, issue 2, pp. 199-207, April 2013.
  35. T. Kanamori, T. Takenouchi,
    Improving LogitBoost with Prior Knowledge.
    Information Fusion, vol. 14, issue 2, pp. 208-219, April 2013.
  36. A. Takeda, H. Mitsugi, T. Kanamori,
    A Unified Classification Model Based on Robust Optimization.
    Neural Computation, Vol. 25, No. 3, Pages 759-804, March 2013.
  37. T. Kanamori, T. Suzuki, M. Sugiyama,
    Computational Complexity of Kernel-Based Density-Ratio Estimation: A Condition Number Analysis. [arXiv]
    Machine Learning, vol. 90, issue 3, pp. 431-460, March 2013.
  38. T. Kanamori, A. Ohara,
    A Bregman Extension of quasi-Newton updates I: An Information Geometrical framework. [arXiv]
    Optimization Methods and Software, vol. 28, issue 1, pp. 96-123, February 2013.
  39. M. Sugiyama, T. Suzuki, T. Kanamori,
    Density-ratio matching under the Bregman divergence: A unified framework of density-ratio estimation. [site]
    Annals of the Institute of Statistical Mathematics, vol. 64, no. 5, pp. 1009-1044, October 2012.
  40. T. Kanamori, H. Uehara, M. Jimbo,
    Pooling Design and Bias Correction in DNA Library Screening. [arXiv]
    Journal of Statistical Theory and Practice, vol. 6, issue 1, pp. 220-238, March 2012.
  41. T. Kanamori, T. Suzuki, M. Sugiyama,
    Statistical analysis of kernel-based least-squares density-ratio estimation. [site]
    Machine Learning, vol. 86, Issue 3, pp. 335-367, March 2012.
  42. T. Kanamori, T. Suzuki, M. Sugiyama,
    f-divergence estimation and two-sample homogeneity test under semiparametric density-ratio models. [arXiv]
    IEEE Transactions on Information Theory, Vol. 58, Issue 2, pp. 708-720, February 2012.
  43. T. Kanamori, A. Takeda,
    Worst-Case Violation of Sampled Convex Programs for Optimization with Uncertainty. [arXiv]
    Journal of Optimization Theory and Applications, vol. 152, Issue 1, pp.171-197, January 2012.
  44. H. Shimodaira, T. Kanamori, M. Aoki, K. Mine,
    Multiscale Bagging and its Applications. [site]
    IEICE Transactions on Information and Systems, Volume E94-D No.10, pp.1924-1932, October 2011.
  45. M. Sugiyama, T. Suzuki, Y. Itoh, T. Kanamori, M. Kimura,
    Least-Squares Two-Sample Test. [site]
    Neural Networks, vol. 24, issue 7, pp. 735-751, September, 2011.
  46. S. Hido, Y. Tsuboi, H. Kashima, M. Sugiyama, T. Kanamori,
    Statistical Outlier Detection Using Direct Density Ratio Estimation. [site]
    Knowledge and Information Systems, vol. 26, num. 2, pp. 309-336, August, 2011.
  47. M. Sugiyama, M. Yamada, P. von Bünau, T. Suzuki, T. Kanamori, M. Kawanabe,
    Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search. [site]
    Neural Networks, vol. 24, pp. 183-198, March, 2011.
  48. T. Kanamori,
    Deformation of Log-Likelihood Loss Function for Multiclass Boosting. [site]
    Neural Networks, vol. 23, issue 7, pp. 843-864, May, 2010.
  49. T. Kanamori, T. Suzuki, M. Sugiyama,
    Theoretical Analysis of Density Ratio Estimation. [site]
    IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol. E93-A, no. 4, pp. 787-798, April, 2010.
  50. M. Sugiyama, I. Takeuchi, T. Suzuki, T. Kanamori, H. Hachiya, D. Okanohara,
    Least-Squares Conditional Density Estimation. [site]
    IEICE Transactions on Information and Systems, vol.E93-D, no.3, pp.583-594, March, 2010.
  51. A. Takeda, T. Kanamori,
    A Robust Approach Based on Conditional Value-at-Risk Measure to Statistical Learning Problems. [site]
    European Journal of Operational Research, vol. 198, issue 1, pp. 287-296, Oct., 2009.
  52. M. Sugiyama, T. Kanamori, T. Suzuki, S. Hido, J. Sese, I. Takeuchi, and L. Wang,
    A Density-ratio Framework for Statistical Data Processing. [site]
    IPSJ Transactions on Computer Vision and Applications, vol. 1, pp. 183-208, Sep. 2009.
  53. T. Kanamori, S. Hido, M. Sugiyama,
    A Least-squares Approach to Direct Importance Estimation. [site]
    Journal of Machine Learning Research. 10(Jul):1391-1445, July, 2009.
  54. I. Takeuchi, K. Nomura, T. Kanamori,
    Nonparametric Conditional Density Estimation Using Piecewise-Linear Path Following for Kernel Quantile Regression. [site]
    Neural Computation, vol. 21, num. 2, pp. 533-559, Feb., 2009.
  55. T. Suzuki, M. Sugiyama, T. Kanamori, and J. Sese,
    Mutual information estimation reveals global associations between stimuli and biological processes. [site]
    BMC Bioinformatics, vol. 10, no. 1, pp.S52, Jan., 2009.
  56. T. Takenouchi, S. Eguchi, N. Murata, T. Kanamori,
    Robust Boosting Algorithm against Mislabelling in Multi-Class Problems. [site]
    Neural Computation, vol. 20, num. 6, pp. 1596-1630, June, 2008.
  57. T. Kanamori,
    Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability. [site]
    IEICE Transactions on Information and Systems, Vol.E90-D, No.12, pp. 2033-2042, Dec., 2007.
  58. T. Kanamori,
    Pool-based Active Learning with Optimal Sampling Distribution and its Information Geometrical Interpretation. [site]
    Neurocomputing, Vol. 71, Issue 1-3, pp. 353-362, Dec., 2007.
  59. T. Kanamori, T. Takenouchi, S. Eguchi, N. Murata,
    Robust Loss Functions for Boosting. [site]
    Neural Computation, 19(8), pp. 2183-2244, Aug., 2007.
  60. T. Kanamori, T. Takenouchi, N. Murata,
    Geometrical Structure of Boosting Algorithm.
    New Generation Computing, Tutorial Series on Brain-Inspired Computing, Part 6, 25(1):117-141, Nov., 2006.
  61. T. Kanamori, and I. Takeuchi,
    Conditional Mean Estimation under Asymmetric and Heteroscedastic Error by Linear Combination of Quantile Regressions.
    Computational Statistics and Data Analysis, Vol 50, Issue 12, pp 3605-3618, Aug., 2006.
  62. N. Murata, T. Takenouchi, T. Kanamori, S. Eguchi,
    Information Geometry of U-Boost and Bregman Divergence.
    Neural Computation, 16(7):1437-1481, July 2004.
  63. T. Kanamori, H. Shimodaira,
    Active Learning algorithm using the maximum weighted log-likelihood estimator
    Journal of Statistical Planning and Inference, Vol. 116, Issue 1, pp. 149-162, Sep., 2003.
  64. I. Takeuchi, Y. Bengio, T. Kanamori,
    Robust Regression with Asymmetric Heavy-Tail Noise distributions.
    Neural Computation, Vol. 14, Num. 10, pp. 2469-2496, Oct., 2002.
  65. T. Kanamori,
    Statistical Asymptotic Theory of Active Learning.
    Annals of the Institute of Statistical Mathematics, Vol. 54, Num. 3, pp. 459-475, Sep., 2002.
  66. T. Kanamori, H. Shimodaira,
    An Active Learning Algorithm Using an Information Criterion for the Maximum Weighted Log-likelihood Estimator.
    Proceedings of the Institute of Statistical Mathematics, Vol. 48, No. 1, 197-212, June, 2000.
  67. T. Kanamori,
    Active Learning Algorithm using Maximum Weighted Likelihood Estimator.
    Bulletin of the Computational Statistics in Japan, vol. 11, Num. 2, pp. 65-75, Oct., 1998.

International Conference Papers

  1. H. Irobe, W. Aoki, K. Yamazaki, Y. Zhang, T. Nakagawa, H. Waida, Y. Wada, T. Kanamori,
    Robust VAEs via Generating Process of Noise Augmented Data.
    IEEE International Symposium on Information Theory (IEEE ISIT), 2024.
  2. P. Srey, Y. Zhang, T. Kanamori,
    Open-World Learning Under Dataset Shift.
    IEEE Conference on Artificial Intelligence (IEEE CAI), 2024.
  3. Y. Kokubun, K. Matsui, K. Kutsukake, W. Kumagai, T. Kanamori,
    Local Acquisition Function for Active Level Set Estimation.
    NeurIPS 2023 Workshop on Adaptive Experimental Design and Active Learning in the Real World (RealML), 2023.
  4. H. Waida, Y. Wada, L. Andéol, T. Nakagawa, Y. Zhang, T. Kanamori,
    Towards Understanding the Mechanism of Contrastive Learning via Similarity Structure: A Theoretical Analysis.
    ECML 2023.
  5. Y. Sanada, T. Nakagawa, Y. Wada, K. Takanashi, Y. Zhang, K. Tokuyama, T. Kanamori, T. Yamada,
    Deep Self-Supervised Learning of Speech Denoising from Noisy Speeches.
    INTERSPEECH 2022, September 2022.
  6. H. Sasaki, J. Hirayama, T. Kanamori,
    Mode estimation on matrix manifolds: Convergence and robustness.
    The 25th International Conference on Artificial Intelligence and Statistics (AISTATS2022). March 2022.
  7. H. Sasaki, T. Sakai, T. Kanamori,
    Robust modal regression with direct gradient approximation of modal regression risk.
    The Conference on Uncertainty in Artificial Intelligence (UAI2020). August 2020.
  8. M. Uehara, T. Kanamori, T. Takenouchi, T. Matsuda,
    A Unified Statistically Efficient Estimation Framework for Unnormalized Models.
    The 23rd International Conference on Artificial Intelligence and Statistics (AISTATS 2020). June 2020.
  9. S. Liu, T. Kanamori, W. Jitkrittum, Y. Chen,
    Fisher Efficient Inference of Intractable Models.
    The Neural Information Processing Systems (NeurIPS 2019), December 2019.
  10. K. Matsui, W. Kumagai, K. Kanamori, M. Nishikimi, S. Matsui, T. Kanamori,
    Foundations of transfer learning and its application to multi-center prognostic prediction.
    2019 WNAR/IMS/JR Annual Meeting, Portland, Oregon, USA, June 23-26, 2019.
  11. H. Sasaki, T. Kanamori, A. Hyvarinen, G. Niu, M. Sugiyama
    Hunting geometric features in the probability density function with direct density-derivative-ratio estimation.
    The 11th International Conference of the ERCIM WG on Computational and Methodological Statistics (CMStatistics 2018), December 2018.
  12. K. Matsui, W. Kumagai, T. Kanamori,
    Parallel Distributed Block Coordinate Descent Methods Based on Pairwise Comparison Oracle.
    The 2017 INFORMS Annual Meeting, Houston, Texas, USA, October 22-25, 2017.
  13. H. Sasaki, T. Kanamori, and M. Sugiyama,
    Estimating Density Ridges by Direct Estimation of Density-Derivative-Ratios.
    The 20th International Conference on Artificial Intelligence and Statistics (AISTATS), Proceedings of Machine Learning Research, vol. 54, pp. 204--212, 2017.
  14. Takenouchi T., Kanamori T.
    Empirical Localization of Homogeneous Divergences on Discrete Sample Spaces.
    The Neural Information Processing Systems (NIPS 2015), poster & spotlight, December 2015.
  15. Kanamori T.
    Legendre Transformation in Machine Learning.
    Workshop: Information Geometry for Machine Learning, December 2014.
  16. Fujisawa, H., Kanamori T.
    Affine invariant divergences with applications to robust statistics.
    The 7th International Conference of the ERCIM WG on Computational and Methodological Statistics (ERCIM 2014), the University of Pisa, Italy, 6-8 December 2014.
  17. Kanamori T., Fujisawa, H.
    Affine Invariant Divergences and their Applications.
    The 3rd Institute of Mathematical Statistics, Asia Pacific Rim Meeting, June 29-July 3, 2014.
  18. Sugiyama M., Kanamori T., Suzuki T., Plessis M., Liu S., Takeuchi I.
    Density-Difference Estimation.
    The Neural Information Processing Systems (NIPS 2012), Lake Tahoe, Nevada, United States, 3-8 Dec., 2012.
  19. Kanamori T., Takeda A.
    Non-Convex Optimization on Stiefel Manifold and Applications to Machine Learning.
    The 19th International Conference on Neural Information Processing (ICONIP 2012), Doha, Qatar, 12-15 Nov., 2012.
  20. Takeda A., Kanamori T., Mitsugi H.
    Robust optimization-based classification method.
    The 21st International Symposium on Mathematical Programming (ISMP 2012), Berlin, Germany, 19-24 Aug., 2012.
  21. Kanamori T., Suzuki, T., Sugiyama, M.
    f-divergence estimation and two-sample test under semi-parametric density ratio models.
    The 2nd Institute of Mathematical Statistics, Asia Pacific Rim Meeting (ims-APRM 2012), Tsukuba, Japan, 2-4 July, 2012.
  22. Takeda, A., Mitsugi, H., Kanamori, T.
    A Unified Robust Classification Model.
    29th International Conference on Machine Learning (ICML2012), Edinburgh, Scotland, Jun. 26-Jul. 1, 2012.
  23. Kanamori, T., Takeda, A., Suzuki, T.
    A Conjugate Property between Loss Functions and Uncertainty Sets in Classification Problems.
    25th International Conference on Learning Theory (COLT2012), Edinburgh, Scotland, Jun. 25-Jun. 27, 2012.
  24. Yamada, M., Suzuki, T., Kanamori, T., Hachiya, H., & Sugiyama, M.
    Relative density-ratio estimation for robust distribution comparison.
    Presented at Neural Information Processing Systems (NIPS2011), Granada, Spain, Dec. 13-15, 2011
  25. Shimodaira H., Kanamori T., Aoki M., Mine K.,
    Multiscale Bagging with Applications to Classification and Active Learning.
    The 2nd Asian Conference on Machine Learning, Nov. 2010.
  26. Aoki M., Kanamori T., Shimodaira H.,
    Multiscale-bagging with Applications to Classification.
    The 2nd Asian Conference on Machine Learning, Nov. 2010.
  27. Kanamori T. and Ohara A.,
    A Bregman extension of quasi-Newton updates.
    Information Geometry and its Applications, Germany, Aug. 2010.
  28. Sugiyama, M., Takeuchi, I., Kanamori, T., Suzuki, T., Hachiya, H., & Okanohara, D.,
    Conditional density estimation via least-squares density ratio estimation.
    In Proceedings of Thirteenth International Conference on Artificial Intelligence and Statistics (AISTATS2010), JMLR Workshop and Conference Proceedings, vol.9, pp.781-788, Sardinia, Italy, May 13-15, 2010.
  29. Sugiyama, M., Hara, S., von Bünau, P., Suzuki, T., Kanamori, T., & Kawanabe, M.,
    Direct density ratio estimation with dimensionality reduction.
    In S. Parthasarathy, B. Liu, B. Goethals, J. Pei, and C. Kamath (Eds.), Proceedings of the 10th SIAM International Conference on Data Mining (SDM2010), pp.595-606, Columbus, Ohio, USA, Apr. 29-May 1, 2010.
  30. T. Kanamori
    Efficient direct importance estimation for covariate shift adaptation and outlier detection.
    The 1st Institute of Mathematical Statistics, Asia Pacific Rim Meeting, Seoul, June 28-July 1, 2009
  31. T. Kanamori, T. Suzuki, M. Sugiyama,
    Condition Number Analysis of Kernel-based Density Ratio Estimation.
    ICML workshop on Numerical Mathematics in Machine Learning, Montreal Canada, June 2009.
  32. Suzuki, T., Sugiyama, M., Kanamori, T., Sese, J.,
    Mutual Information Estimation Reveals Global Associations between Stimuli and Biological Process.
    The 7th Asia Pacific Bioinformatics Conference (APBC2009) Beijing, China, 13-16 January 2009.
  33. Kanamori, T.
    A Least-squares Approach to Direct Importance Estimation and its Applications.
    Joint Session of the CSA, JSS and KSS at 2008 Statistical Symposium, Taipei, Dec. 19, 2008.
  34. Hido, S., Tsuboi, Y., Kashima, H., Sugiyama, M., Kanamori, T.,
    Inlier-based outlier detection via direct density ratio estimation.
    In Proceedings of the IEEE International Conference on Data Mining (ICDM2008), Pisa, Italy, Dec. 15-19, 2008.
  35. T. Kanamori, M. Sugiyama, and S. Hido,
    Efficient Direct Density Ratio Estimation for Non-stationarity Adaptation and Outlier Detection.
    Neural Information Processing Systems (NIPS 2008), December 2008.
  36. T. Suzuki, M. Sugiyama, J. Sese, and T. Kanamori,
    A least-squares approach to mutual information estimation with application in variable selection.
    Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery 2008 (FSDM2008), Antwerp, Belgium, Sep. 15, 2008.
  37. Suzuki, T., Sugiyama, M., Sese, J. and Kanamori, T.
    Approximating mutual information by maximum likelihood density ratio estimation.
    In Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery 2008 (FSDM2008),
    JMLR Workshop and Conference Proceedings, Antwerp, Belgium, Sep. 15, 2008.
  38. Kanamori, T.
    Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability.
    18th International Conference on Algorithmic Learning Theory, Sendai International Center, Sendai, Japan, 2007.
  39. Kanamori, T.
    Worst-Case Violation of Sampled Convex Programs for Optimization with Uncertainty.
    International Conference on Continuous Optimization, Hamilton, Canada, 2007.
  40. Kanamori, T. and Takeda, A.
    Worst-Case Violation of Sampled Convex Programs for Optimization with Uncertainty.
    International Symposium on Mathematical Programming, Rio de Janeiro, Brazil, 2006.
  41. Takeuchi, I., Nomura, K. and Kanamori, T.
    The Entire Solution Path of Kernel-based Nonparametric Conditional Quantile Estimator.
    International Joint Conference on Neural Networks, Vancouver, Canada, 2006.
  42. Kanamori, T.
    Integrability of weak learner on boosting.
    The 2nd International Symposium on Information Geometry and its Applications, pp. 300-307, University of Tokyo, Tokyo, Japan, 2005.
  43. Kanamori, T. and Takeuchi, I.
    Estimators for Conditional Expectations under Asymmetric and Heteroscedastic Error Distributions.
    International Symposium on The Art of Statistical Metaware, The Institute of Statistical Mathematics, Tokyo, Japan, 2005.
  44. Kanamori T., Takenouchi, T., Eguchi, S., and Murata, N.
    The most robust loss function for boosting.
    Neural Information Processing: 11th International Conference (ICONIP 2004), Calcutta, Lecture Notes in Computer Science, Vol. 3316, pp. 496-501, Springer.
  45. Kanamori T. and Takeuchi, I.
    Robust Estimation of Conditional Mean by the Linear Combination of Quantile Regressions.
    International Conference on Robust Statistics, Beijing, China, 2004.
  46. Kanamori, T.
    A New Sequential Algorithm for Regression Problems by using Mixture Distribution.
    In Proceedings of 2002 International Conference on Artificial Neural Networks (ICANN'02), pp. 535-540, Madrid, Spain, August 2002.
  47. Bengio, Y., Takeuchi, I. and Kanamori, T.
    The Challenge of Non-Linear Regression on Large Datasets with Asymmetric Heavy Tails.
    In Proceedings of the Joint Statistical Meeting, American Statistical Association, New York, U.S.A., August 2002.
  48. Shimodaira, H., and Kanamori, T.
    Information Criteria for Predictive Inference with the Weighted Log-Likelihood and the Active Learning.
    International Society for Bayesian Analysis, Sixth World Meeting, Hersonissos, Heraklion, Crete, May 2000.

Articles

  1. Kanamori, Kabashima, Takayasu, Nakano, Fukuda, Miyoshi, Yamashita, Watanabe,
    Introduction of Operations Research at the Department of Mathematical and Computing Science, Tokyo Institute of Technology (in Japanese),
    Operations Research, Vol. 64, No. 1, 2019.
  2. Kanamori, T.,
    Divergence Estimation using Density-ratio and its Applications (in Japanese),
    Japan Statistical Society, Sep. 2014.
  3. Kanamori, T.,
    Statistics -- Statistical Learning Theory (in Japanese),
    Suugaku Seminar, May 2011.
  4. Sugiyama, M., Suzuki, T., Kanamori, T.,
    Density ratio estimation: A comprehensive review.
    In Statistical Experiment and Its Related Topics, Research Institute for Mathematical Sciences Kokyuroku, no.1703, pp.10-31, 2010.
  5. Murata, N., Kanamori, T., Takenouchi, T.,
    Boosting and Learning Algorithms (in Japanese),
    The Journal of the Institute of Electronics, Information and Communication Engineers. 88(9), pp.724-729, 2005.
  6. Kanamori, T., Murata, N.,
    Boosting and Robustness (in Japanese),
    The Journal of the Institute of Electronics, Information and Communication Engineers. Vol. 86, No. 10, pp. 769-772, 2003.

Patent

  1. Y. Mae, T. Kanamori,
    Arithmetic Device and Trained Model (演算装置および学習済みモデル, in Japanese).
    Japanese Patent: Application No. 2021-555926, filed 2020/09/17; Registration No. 7386462, registered 2023/11/16.

Books

  1. T. Kanamori, Ed.,
    Data Science and Machine Learning (Translation)
    Tokyo Kagaku Dojin, Dec. 2022.
  2. T. Kanamori,
    Introduction to Statistical Machine Learning with Python (in Japanese)
    Ohmsha, Nov. 2018.
  3. T. Kanamori,
    Introduction to Machine Learning with R (in Japanese)
    Ohmsha, Nov. 2017.
  4. T. Kanamori, T. Suzuki, I. Takeuchi, I. Sato,
    Continuous Optimization for Machine Learning (in Japanese)
    Kodansha Scientific, Dec. 2016.
  5. T. Kanamori,
    Statistical Learning Theory (in Japanese)
    Kodansha Scientific, Aug. 2015.
  6. M. Sugiyama, T. Suzuki, T. Kanamori,
    Density Ratio Estimation in Machine Learning
    Cambridge University Press, Feb. 2012.
  7. T. Kanamori, T. Takenouchi, N. Murata,
    Pattern Recognition in R (in Japanese)
    Kyoritsu Shuppan, Oct. 2009.
  8. T. Kanamori, K. Hatano, O. Watanabe,
    Boosting (in Japanese)
    Morikita Shuppan, Sep. 2006.

Book Chapters

  1. Multivariate Analysis
    in New Handbook of Medical Statistics (in Japanese), Asakura Shoten, July 2018.
        
  2. Loss Functions and Risk Measures in Statistical Learning Theory
    in Aspects of Modeling (in Japanese), Kindai Kagaku Sha, Sep. 2016.
        
  3. [Translation] Randomized Optimization
    in Handbook of Monte Carlo Methods (in Japanese), Asakura Shoten, Oct. 2014.
        
  4. [Translation] Model Assessment and Selection
    in The Elements of Statistical Learning (in Japanese), Kyoritsu Shuppan, June 2014.
        
  5. Statistical Learning Theory
    in Handbook of Applied Mathematics (in Japanese)
    Asakura Shoten, Nov. 2013.
        
  6. Ensemble Learning
    in Encyclopedia of Mathematical Engineering (in Japanese)
    Asakura Shoten, Nov. 2011.
  7. T. Kanamori, H. Shimodaira,
    Geometry of Covariate Shift with Applications to Active Learning
    in Dataset Shift in Machine Learning, MIT Press, 2008.
