
Knowledge-Augmented Methods for Natural Language Processing - (Springerbriefs in Computer Science) (Hardcover)

$49.99 when purchased online
Target Online store #3991

About this item

Highlights

  • Over the last few years, natural language processing has seen remarkable progress due to the emergence of larger-scale models, better training techniques, and greater availability of data.
  • About the Author: Dr. Meng Jiang is currently an assistant professor in the Department of Computer Science and Engineering at the University of Notre Dame.
  • Computers + Internet, Speech & Audio Processing
  • Series Name: Springerbriefs in Computer Science

Description



Book Synopsis



Over the last few years, natural language processing has seen remarkable progress due to the emergence of larger-scale models, better training techniques, and greater availability of data. Examples of these advancements include GPT-4, ChatGPT, and other pre-trained language models. These models are capable of characterizing linguistic patterns and generating context-aware representations, resulting in high-quality output. However, these models rely solely on input-output pairs during training and, therefore, struggle to incorporate external world knowledge, such as named entities, their relations, common sense, and domain-specific content. Incorporating knowledge into the training and inference of language models is critical to their ability to represent language accurately. Additionally, knowledge is essential in achieving higher levels of intelligence that cannot be attained through statistical learning of input text patterns alone. In this book, we will review recent developments in the field of natural language processing, specifically focusing on the role of knowledge in language representation. We will examine how pre-trained language models like GPT-4 and ChatGPT are limited in their ability to capture external world knowledge and explore various approaches to incorporate knowledge into language models. Additionally, we will discuss the significance of knowledge in enabling higher levels of intelligence that go beyond statistical learning on input text patterns. Overall, this survey aims to provide insights into the importance of knowledge in natural language processing and highlight recent advances in this field.






About the Author



Dr. Meng Jiang is currently an assistant professor in the Department of Computer Science and Engineering at the University of Notre Dame. He obtained his B.E. and Ph.D. from Tsinghua University, spent two years as a postdoc at UIUC, and joined Notre Dame in 2017. His research interests include data mining, machine learning, and natural language processing, and he has published more than 100 peer-reviewed papers on these topics. He is the recipient of the Notre Dame International Faculty Research Award. His honors and awards include Best Paper Finalist at KDD 2014, the Best Paper Award at KDD-DLG 2020, and the ACM SIGSOFT Distinguished Paper Award at ICSE 2021. He received the NSF CRII Award in 2019 and the NSF CAREER Award in 2022.

Bill Yuchen Lin is a postdoctoral young investigator at the Allen Institute for AI (AI2), advised by Prof. Yejin Choi. He received his Ph.D. from the University of Southern California in 2022, advised by Prof. Xiang Ren. His research goal is to teach machines to think, talk, and act with commonsense knowledge and commonsense reasoning ability, as humans do. Toward this goal, he has been developing knowledge-augmented reasoning methods (e.g., KagNet, MHGRN, DrFact) and constructing benchmark datasets (e.g., CommonGen, RiddleSense, X-CSR) that require commonsense knowledge and complex reasoning for both NLU and NLG. He initiated an online compendium of commonsense reasoning research, which serves as a portal for the community.

Dr. Shuohang Wang is a senior researcher on the Knowledge and Language Team of the Cognitive Services Research Group. His research mainly focuses on question answering, multilingual NLU, summarization with deep learning, reinforcement learning, and few-shot learning. He has served as an area chair or senior PC member for ACL, EMNLP, and AAAI, and co-organized the AAAI'23 Workshop on Knowledge Augmented Methods for NLP.

Dr. Yichong Xu is a senior researcher on the Knowledge and Language Team of the Cognitive Services Research Group. His research focuses on the combination of knowledge and NLP, with applications to question answering, summarization, and multimodal learning. He led the effort to achieve human parity on the CommonsenseQA benchmark. He has presented tutorials on knowledge-augmented NLP methods at ACL and WSDM. Prior to joining Microsoft, Dr. Xu received his Ph.D. in machine learning from Carnegie Mellon University.

Wenhao Yu is a Ph.D. candidate in the Department of Computer Science and Engineering at the University of Notre Dame. His research focuses on combining language models with knowledge to solve knowledge-intensive applications, such as open-domain question answering and commonsense reasoning. He has published over 15 conference papers and presented 3 tutorials at machine learning and natural language processing conferences, including ICLR, ICML, ACL, and EMNLP. He received the Bloomberg Ph.D. Fellowship in 2022 and won the Best Paper Award at SoCal NLP 2022. He has been a research intern at Microsoft Research and the Allen Institute for AI.

Dr. Chenguang Zhu is a principal research manager in the Microsoft Cognitive Services Research Group, where he leads the Knowledge and Language Team. His research covers knowledge-enhanced language models, text summarization, and prompt learning. Dr. Zhu has led teams to achieve human parity on CommonsenseQA, HellaSwag, and CoQA, and first place on CommonGen, FEVER, ARC, and SQuAD v1.0. He holds a Ph.D. in Computer Science from Stanford University and has published over 100 papers on NLP and knowledge-augmented methods. He has held tutorials and workshops on knowledge-augmented NLP at conferences such as ACL, AAAI, and WSDM. He is the author of the book Machine Reading Comprehension: Algorithm and Practice, published by Elsevier.


Dimensions (Overall): 9.25 Inches (H) x 6.1 Inches (W)
Suggested Age: 22 Years and Up
Genre: Computers + Internet
Sub-Genre: Speech & Audio Processing
Series Title: Springerbriefs in Computer Science
Publisher: Springer
Format: Hardcover
Author: Meng Jiang & Bill Yuchen Lin & Shuohang Wang & Yichong Xu & Wenhao Yu & Chenguang Zhu
Language: English
Street Date: April 11, 2024
TCIN: 1004618308
UPC: 9789819707461
Item Number (DPCI): 247-44-2960
Origin: Made in the USA or Imported

Shipping details

Estimated ship dimensions: 1 inch length x 6.1 inches width x 9.25 inches height
Estimated ship weight: 1 pound
We regret that this item cannot be shipped to PO Boxes.
This item cannot be shipped to the following locations: American Samoa, Guam, Northern Mariana Islands, Puerto Rico, United States Minor Outlying Islands, U.S. Virgin Islands, APO/FPO

Return details

This item can be returned to any Target store or Target.com.
This item must be returned within 90 days of the date it was purchased in store, shipped, delivered by a Shipt shopper, or made ready for pickup.
See the return policy for complete information.

Trending Computers & Technology Books

Planes by Byron Barton (Board Book)

$7.99
Buy 2 Get 1 Free Books, Movies, Music and Funko

The Poor Man's James Bond (vol. 1) - by Kurt Saxon

$31.99 - $40.56
MSRP $39.99 - $49.99
Buy 2 Get 1 Free Books, Movies, Music and Funko

Nexus - by Yuval Noah Harari (Hardcover)

$21.71
MSRP $35.00
Buy 2 Get 1 Free Books, Movies, Music and Funko
5 out of 5 stars with 1 rating

The Coming Wave - by Mustafa Suleyman

$16.99 - $18.83
MSRP $20.00 - $32.50
Buy 2 Get 1 Free Books, Movies, Music and Funko
