Risk #1: LLMs can produce factually incorrect text.
Risk #2: LLMs can produce untrustworthy explanations.
Risk #3: LLMs can persuade and influence people, and they can give unhealthy advice.
Risk #4: LLMs can simulate feelings, personality, and relationships.
Risk #5: LLMs can change their outputs dramatically based on tiny changes in a conversation.
Risk #6: LLM providers can store your conversations and use them as training data.
Risk #7: LLMs cannot attribute sources for the text they produce.