TestingCatalog News (@testingcatalog)

Anthropic is reportedly developing "Knowledge Bases" for Claude Cowork. KBs appear to be topic-specific, persistent knowledge stores that Claude manages automatically, a kind of per-topic memory; the internal instructions describe them as "persistent knowledge repositories". The feature points to a significant change in how Claude maintains context and personalization.

https://x.com/testingcatalog/status/2012891786226626919

#anthropic #claude #knowledgebases #ai #memory

TestingCatalog News 🗞 (@testingcatalog) on X

BREAKING 🚨: Anthropic is working on "Knowledge Bases" for Claude Cowork. KBs seem to be a new concept of topic-specific memories, which Claude will automatically manage! And a bunch of other new things. Internal Instruction 👀 "These are persistent knowledge repositories.

#lispyGopherClimate Archive pending
https://communitymedia.video/c/screwtape_channel/videos
@kentpitman answering some of the questions and comments in this thread:
https://gamerplus.org/@screwlisp/115533182674747915

- several notes on islisp https://islisp.info/

- Tool reuse versus new #programming
- (modularity versus globality)

- Learning using only symbols?

- the questions thread included a discussion of @JohnMashey vis-à-vis unix

there will be a part 2

#LISP #ISlisp #knowledgeBases #languageChoice #programmingLanguage

Zetteldeft (together with Org-Journal) helped me record and organise a lot of information. However, its development stopped last year, and I've since learned about Denote.

Today I'm figuring out how to migrate my knowledge base to the new tool.

#Emacs #Zettelkasten #KnowledgeBases

Anybody here know Joe Biden? Or any high-ranking federal figures? I want to encourage them to immediately begin safeguarding critical information in these precious two months before the Project 2025 flunkies start erasing files.

#archive #knowledgebases #libraryofcongress #nationalarchive

Submit New Content Providers to Ideas Exchange through November 13, 2024 – IGeLU

Large language models converge toward human-like concept organization
https://arxiv.org/abs/2308.15047

* large language models show human-like performance in knowledge extraction, reasoning, and dialogue
* LLMs organize concepts strikingly similarly to how concepts are organized in KBs such as WikiData
* KBs model collective, institutional knowledge
* LLMs seem to induce such knowledge from raw text

#LLM #LargeLanguageModels #concepts #epistemology #semantics #KnowledgeBases #WikiData #KnowledgeGraph

Large language models converge toward human-like concept organization

Large language models show human-like performance in knowledge extraction, reasoning and dialogue, but it remains controversial whether this performance is best explained by memorization and pattern matching, or whether it reflects human-like inferential semantics and world knowledge. Knowledge bases such as WikiData provide large-scale, high-quality representations of inferential semantics and world knowledge. We show that large language models learn to organize concepts in ways that are strikingly similar to how concepts are organized in such knowledge bases. Knowledge bases model collective, institutional knowledge, and large language models seem to induce such knowledge from raw text. We show that bigger and better models exhibit more human-like concept organization, across four families of language models and three knowledge graph embeddings.
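One common way to compare how two embedding spaces organize the same concepts is representational similarity analysis: build each space's concept-by-concept similarity matrix and correlate them. This is an illustrative sketch of that general technique, not the paper's actual method, using only NumPy:

```python
import numpy as np

def pairwise_cosine(X):
    # Normalize rows; inner products then give cosine similarities.
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

def rsa_score(emb_a, emb_b):
    """Correlate the concept-by-concept similarity structure of two
    embedding spaces (rows = the same concepts in the same order)."""
    sim_a = pairwise_cosine(emb_a)
    sim_b = pairwise_cosine(emb_b)
    iu = np.triu_indices(len(sim_a), k=1)  # off-diagonal upper triangle
    return np.corrcoef(sim_a[iu], sim_b[iu])[0, 1]

# Toy check: a rotated copy of a space has identical similarity structure,
# so the score is 1 even though the coordinates differ completely.
rng = np.random.default_rng(0)
A = rng.normal(size=(10, 8))
Q, _ = np.linalg.qr(rng.normal(size=(8, 8)))  # random orthogonal rotation
print(round(rsa_score(A, A @ Q), 6))  # → 1.0
```

A score near 1 between an LLM's concept embeddings and knowledge-graph embeddings of the same entities would indicate the kind of convergent concept organization the abstract describes.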

After the #summerschool is before the next lecture ;-)
In the last #ise2023 lecture we learned how to create simple models with RDFS, and about the distinction between the A-Box and the T-Box in RDF #knowledgebases
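A tiny RDFS fragment (my own illustration, not from the lecture) showing the distinction: T-Box triples define the terminology (classes and properties), while A-Box triples assert facts about individuals:

```turtle
@prefix rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix ex:   <http://example.org/> .

# T-Box: the terminology (schema)
ex:Person       a rdfs:Class .
ex:Organization a rdfs:Class .
ex:Researcher   a rdfs:Class ;
                rdfs:subClassOf ex:Person .
ex:affiliatedWith a rdf:Property ;
                rdfs:domain ex:Researcher ;
                rdfs:range  ex:Organization .

# A-Box: assertions about individuals
ex:alice a ex:Researcher ;
         ex:affiliatedWith ex:KIT .
ex:KIT   a ex:Organization .
```

An RDFS reasoner can combine the two: from the A-Box triple `ex:alice a ex:Researcher` and the T-Box triple `ex:Researcher rdfs:subClassOf ex:Person`, it infers `ex:alice a ex:Person`.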

#knowledgegraphs #RDF #knowledgebase #ai #isws2023 @fizise @KIT_Karlsruhe @enorouzi

How can I request new collections to be added to our Discovery Indexes, KnowledgeBases and the Alma Community Zone?

How can I request new collections and data to be added to the Alma Community Zone, Summon Index, Primo Central, 360 KB and SFX KnowledgeBase? At the Ex Libris Idea Exchange, the Content forum is …
