
Opportunities and Challenges for Digital Publishing Enabled by Large Language Models

Cited by: 10
Abstract: Large Language Models (LLMs), as represented by ChatGPT, are widely used in application scenarios such as conversation generation, text summarization, and question-answering systems. They have significant advantages in text editing and knowledge memorization, and so can often replace repetitive and time-consuming work. This brings new opportunities in digital publishing for writing, topic planning, manuscript review and proofreading, corpus provision for digital humans, intelligent customer service and sales, and institutional management and decision-making. At the same time, their weaknesses in logical reasoning, cultural association, and meaning extraction are evident: their generated content has not yet been integrated with an appropriate view of publishing, is prone to bias and error, and remains the subject of constant intellectual property disputes. Moreover, these models widen the information gap between diverse groups, exacerbate the formation of information silos, make it more difficult to discern the value of content, and raise the threshold of information literacy required of users.
Authors: ZHANG Ning; Simon Mahony (1. Faculty of Arts and Sciences, Beijing Normal University, Beijing 100091, China; 2. College of Education for the Future, Beijing Normal University, Beijing 100091, China; 3. Department of Information Studies, University College London, London W5 5RF, UK)
Source: Editorial Friend (《编辑之友》), CSSCI, Peking University Core Journal, 2023, No. 11, pp. 45-51 (7 pages)
Fund: Guangdong Provincial Education Science Planning Project, “Research on the Model and Practice of a Gamified VR System of Ancient Books for the Traditional Culture Education of University Students” (2022GXJK416)
Keywords: digital publishing; large language model; ChatGPT; artificial intelligence; publishing industry
