Abstract: Tokenization is the process of mapping sentences from character strings into strings of words. This paper sets out to study critical tokenization,...
Original link: http://dl.acm.org/citation.cfm?id=972799
Give someone roses, and the fragrance stays on your hands~ If you have already downloaded this resource, you can upload it in a reply to share with everyone. You are welcome to join the CDA community to exchange ideas and learn. (For academic exchange only.)