1.

Record No.

UNINA9910451615203321

Author

Wyschogrod, Edith

Title

Crossover queries [electronic resource] : dwelling with negatives, embodying philosophy's others / Edith Wyschogrod

Publication/distribution

New York : Fordham University Press, 2006

ISBN

0-8232-3514-9

0-8232-4764-3

1-4294-7898-5

Edition

[1st ed.]

Physical description

1 online resource (588 p.)

Series

Perspectives in continental philosophy ; no. 52

Discipline

190

Subjects

Philosophy, Modern

Theology

Ethics

Electronic books.

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

General notes

Description based upon print version of record.

Bibliography note

Includes bibliographical references (p. 505-561) and index.

Contents note

pt. 1. God : desiring the infinite -- pt. 2. Training bodies : pedagogies of pain -- pt. 3. Bodies : subject or code -- pt. 4. Nihilation and the ethics of alterity -- pt. 5. Conversations -- pt. 6. The art in ethics -- pt. 7. Comparing philosophies.

Summary/abstract

Exploring the risks, ambiguities, and unstable conceptual worlds of contemporary thought, this book brings together the wide-ranging writings, across twenty years, of one of our most important philosophers, Edith Wyschogrod.



2.

Record No.

UNISA996466396403316

Author

Ahlswede, Rudolf, 1938-

Title

Identification and other probabilistic models : Rudolf Ahlswede's lectures on information theory 6 / Rudolf Ahlswede ; editors, Alexander Ahlswede [and three others]

Publication/distribution

Cham, Switzerland : Springer, [2021]

©2021

ISBN

3-030-65072-3

Physical description

1 online resource (720 pages)

Series

Foundations in Signal Processing, Communications and Networking ; v. 16

Discipline

003.54

Subjects

Information theory

Electronic books

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

Bibliography note

Includes bibliographical references and index.

Contents note

Intro -- Words and Introduction of the Editors -- Preface -- Preamble -- Contents -- Notation and Abbreviations -- Part I Identification via Channels -- Identification via Channels -- 1 Results and Preliminaries -- 1.1 Notation and Known Facts -- 1.1.1 Entropy and Information Quantities -- 1.1.2 Channels, Empirical Distributions, Generated Sequences -- 1.1.3 Elementary Properties of Typical Sequences and Generated Sequences -- 1.1.4 Formulation of the Classical Transmission Problem -- 1.2 Formulation of the Identification Problem -- 1.2.1 The Double Exponent Coding Theorem -- 2 The Direct Parts of the Coding Theorems -- 3 The Strong Converses -- 3.1 Analytic Proof of the Strong Converse -- 3.1.1 Proof of Lemma 25 -- 3.2 Combinatorial Proof of the Strong Converse -- 4 Discussion -- References -- Identification in the Presence of Feedback: A Discovery of New Capacity Formulas -- 1 The Results -- 2 Notation and Known Facts -- 3 New Proof of the Direct Part in Theorem 12 -- 4 Proof of the Direct Part of Theorem 40 -- 5 Proof of the Direct Part of Theorem 41 -- 6 Proof of the Converse Part of Theorem 40 -- 7 Proof of the Converse Part of Theorem 41 -- References -- On Identification via Multi-Way Channels with Feedback: Mystery Numbers -- 1 Introduction -- 2 Review of Known Concepts and Results -- 3 A General Model for Communication Systems -- 4 Classes of Feedback Strategies, Common Random Experiments and Their Mystery Numbers -- 5 Main Theorem and Consequences -- 6 A Method for Proving Converses in Case of Feedback -- 7 A 3-Step ID Scheme for the Noiseless BSC -- 8 Extension of the 3-Step ID Scheme to the DMC With and Without Feedback -- 9 Proof of Theorems 53 and 54 -- 10 Proof of Theorem 61, Optimality of Our Coding Scheme -- References -- Identification Without Randomization -- 1 Introduction and Results -- 2 Proof of Theorem 67 -- 3 Proof of Theorem 69 -- 4 Proof of Theorem 70 -- 5 Proof of Theorem 71 -- 6 Proof of Lemma 73 -- 7 Proof of Theorem 74 -- References -- Identification via Channels with Noisy Feedback -- 1 Introduction -- 2 Proof of Theorem 75 -- References -- Identification via Discrete Memoryless Wiretap Channels -- 1 Introduction -- 2 Proof of Theorem 87 -- References -- Part II A General Theory of Information Transfer -- Introduction -- References -- One Sender Answering Several Questions of Receivers -- 1 A General Communication Model for One Sender -- 2 Analysis of a Specific Model: K-Identification -- 3 Models with Capacity Equal to the Ordinary Capacity -- References -- Models with Prior Knowledge of the Receiver -- 1 Zero-error Decodable Hypergraphs -- 2 K-Separating Codes -- 3 Analysis of a Model with Specific Constraints: 2-Separation and Rényi's Entropy H2 -- 4 Binning via Channels -- 5 K-Identifiability, K-Separability and Related Notions -- References -- Models with Prior Knowledge at the Sender -- 1 Identification via Group Testing and a Stronger Form of the Rate-Distortion Theorem -- References -- Identification and Transmission with Multi-way Channels -- 1 Simultaneous Transfer: Transmission and Identification -- 2 A Proof of the Weak Converse to the Identification Coding Theorem for the DMC -- 3 Two Promised Results: Characterisation of the Capacity Regions for the MAC and the BC for Identification -- 4 The Proof for the MAC -- 5 The Proof for the BC -- References -- Data Compression -- 1 Noiseless Coding for Identification -- 2 Noiseless Coding for Multiple Purposes -- References -- Perspectives -- 1 Comparison of Identification Rate and Common Randomness Capacity: Identification Rate can Exceed Common Randomness Capacity and Vice Versa -- 2 Robustness, Common Randomness and Identification -- 3 Beyond Information Theory: Identification as a New Concept of Solution for Probabilistic Algorithms -- References -- Part III Identification, Mystery Numbers, or Common Randomness -- The Role of Common Randomness in Information Theory and Cryptography: Secrecy Constraints -- 1 Introduction -- 2 Generating a Shared Secret Key When the Third Party Has No Side Information -- 3 Secret Sharing When the Third Party Has Side Information -- 4 Proofs -- 5 Conclusions -- References -- Common Randomness in Information Theory and Cryptography CR Capacity -- 1 Introduction -- 2 Preliminaries -- 2.1 Model (i): Two-Source with One-Way Communication -- 2.2 Model (ii): DMC with Active Feedback -- 2.3 Model (iii): Two-Source with Two-Way Noiseless Communication -- 2.4 Models with Robust CR -- 3 Some General Results -- 4 Common Randomness in Models (i), (ii), and (iii) -- 5 Common Randomness, Identification, and Transmission for Arbitrarily Varying Channels -- 5.1 Model (A): AVC Without Feedback and Any Other Side Information -- 5.2 Model (B): AVC with Noiseless (Passive) Feedback -- 5.3 Model (C): Strongly Arbitrarily Varying Channel (SAVC) -- References -- Watermarking Identification Codes with Related Topics on Common Randomness -- 1 Introduction -- 2 The Notation -- 3 The Models -- 3.1 Watermarking Identification Codes -- 3.2 The Common Randomness -- 3.3 The Models for Compound Channels -- 4 The Results -- 4.1 The Results on Common Randomness -- 4.2 The Results on Watermarking Identification Codes -- 4.3 A Result on Watermarking Transmission Code with a Common Experiment Introduced by Steinberg-Merhav -- 5 The Direct Theorems for Common Randomness -- 6 The Converse Theorems for Common Randomness -- 7 Construction of Watermarking Identification Codes from Common Randomness -- 8 A Converse Theorem of a Watermarking Coding Theorem Due to Steinberg-Merhav -- References -- Transmission, Identification and Common Randomness Capacities for Wire-Tap Channels with Secure Feedback from the Decoder -- 1 Introduction -- 2 Notation and Definitions -- 3 Previous and Auxiliary Results -- 4 The Coding Theorem for Transmission and Its Proof -- 5 Capacity of Two Special Families of Wire-Tap Channels -- 6 Discussion: Transmission, Building Common Randomness and Identification -- 7 The Secure Common Randomness Capacity in the Presence of Secure Feedback -- 8 The Secure Identification Capacity in the Presence of Secure Feedback -- References -- Secrecy Systems for Identification Via Channels with Additive-Like Instantaneous Block Encipherer -- 1 Introduction -- 2 Background -- 3 Model -- 4 Main Result -- References -- Part IV Identification for Sources, Identification Entropy, and Hypothesis Testing -- Identification for Sources -- 1 Introduction -- 1.1 Pioneering Model -- 1.1.1 Further Models and Definitions -- 2 A Probabilistic Tool for Generalized Identification -- 3 The Uniform Distribution -- 4 Bounds on L(P) for General P=(P1,…,PN) -- 4.1 An Upper Bound -- 5 An Average Identification Length -- 5.1 Q is the Uniform Distribution on V=U -- 5.2 The Example Above in Model GID with Average Identification Length for a Uniform Q* -- References -- Identification Entropy -- 1 Introduction -- 2 Noiseless Identification for Sources and Basic Concept of Performance -- 3 Examples for Huffman Codes -- 4 An Identification Code Universally Good for all P on U={1,2,…,N} -- 5 Identification Entropy HI(P) and Its Role as Lower Bound -- 6 On Properties of (PN) -- 6.1 A First Idea -- 6.2 A Rearrangement -- 7 Upper Bounds on (PN) -- 8 The Skeleton -- 9 Directions for Research -- References -- An Interpretation of Identification Entropy -- 1 Introduction -- 1.1 Terminology -- 1.2 A New Terminology Involving Proper Common Prefices -- 1.3 Matrix Notation -- 2 An Operational Justification of ID-Entropy as Lower Bound for LC(P,P) -- 3 An Alternative Proof of the ID-Entropy Lower Bound for LC(P,P) -- 4 Sufficient and Necessary Conditions for a Prefix Code C to Achieve the ID-Entropy Lower Bound of LC(P,P) -- 5 A Global Balance Principle to Find Good Codes -- 6 Comments on Generalized Entropies -- References -- L-Identification for Sources -- 1 Introduction -- 2 Definitions and Notation -- 2.1 Source Coding and Code Trees -- 2.2 L-Identification -- 3 Two New Results for (1-)Identification -- 3.1 (1-)Identification for Block Codes -- 3.2 An Improved Upper Bound for Binary Codes -- 4 L-Identification for the Uniform Distribution -- 4.1 Colexicographic Balanced Huffman Trees -- 4.2 An Asymptotic Theorem -- 5 Two-Identification for General Distributions -- 5.1 An Asymptotic Approach -- 5.2 The q-ary Identification Entropy of Second Degree -- 5.3 An Upper Bound for Binary Codes -- 6 L-Identification for General Distributions -- 7 L-Identification for Sets -- 8 Open Problems -- 8.1 Induction Base for the Proof of Proposition 243 -- 8.2 L-Identification for Block Codes -- 8.3 L-Identification for Sets for General Distributions -- Appendix -- References -- Testing of Hypotheses and Identification -- 1 Preliminaries: Testing of Hypotheses and L1-Distance -- 2 Measures Separated in L1-Metrics -- 3 Identification Codes or "How Large is the Set of all Output Measures for Noisy Channel?" -- Appendix -- References -- On Logarithmically Asymptotically Optimal Testing of Hypotheses and Identification -- 1 Problem Statement -- 2 Background -- 3 Identification Problem for Model with Independent Objects -- 4 Identification Problem for Models with Different Objects -- 5 Identification of the Probability Distribution of an Object -- 6 r-Identification and Ranking Problems -- 7 Conclusion and Extensions of Problems.

3.

Record No.

UNINA9910485040203321

Author

Herrmann, Leonhard

Title

Literarische Vernunftkritik im Roman der Gegenwart / von Leonhard Herrmann

Publication/distribution

Stuttgart : J.B. Metzler, 2017

ISBN

3-476-04351-7

Edition

[1st ed. 2017.]

Physical description

1 online resource (IX, 366 p.)

Discipline

809

Subjects

Literature, Modern—20th century

Literature, Modern—21st century

Contemporary Literature

Language of publication

German

Format

Printed material

Bibliographic level

Monograph

Contents note

Dank -- Einleitung: Was will der Roman der Gegenwart? -- I. Voraussetzungen literarischer Vernunftkritik: Philosophische Vernunftkritik und ihre Aporien -- 1. Vorgeschichte der Vernunftkritik -- 2. Horkheimer/Adorno: Vernunft- als Aufklärungskritik -- 3. Heideggers Fundamentalontologie als Metaphysikkritik -- 4. Gadamers Hermeneutik (und ihre Kontinuitäten) -- 5. Blumenbergs Metaphorologie als ›Höhlenausgang‹ -- 6. Vernunftkritik in Sprachphilosophie und Konstruktivismus -- 7. Die ›andere‹ Vernunft in Poststrukturalismus und ›Postmoderne‹ -- 8. Diskursivierung, Pluralisierung und Ästhetisierung als ›Rettung‹ von Rationalität: Habermas, Davidson, Seel -- 9. Rational – irrational – nicht-rational: Vernunftkritik in der literarischen Kommunikation der Gegenwart -- II. Systematik literarischer Vernunftkritik -- 1. Erzähltheorie als ›Logik der Dichtung‹ -- 2. Fiktionalität und Vernunftkritik -- 3. Unzuverlässiges Erzählen, Fokalisierung und Multiperspektivität -- 4. Fantastisches Erzählen -- 5. Unnatürliches und metaleptisches Erzählen -- III. Formen literarischer Vernunftkritik -- 1. Gelehrtenromane und die Grenzen der Vernunft -- 2. Postapokalyptische Romane und die literarische ›Zeitigung der Zeitlichkeit‹ -- 3. Gesellschaftsromane und die Kritik der ökonomischen Vernunft -- 4. Erinnerungsromane und die Kritik der historiografischen Vernunft -- 5. Reiseromane und die Kritik der ›kartografischen Vernunft‹ -- Schluss: Literarische Vernunftkritik und ihre Aporien -- Anhang.

Summary/abstract

This study analyzes central contemporary German-language novels with regard to their relationship to reason. Authors such as Daniel Kehlmann and Sibylle Lewitscharoff, Thomas Glavinic and Thomas Lehr, Terézia Mora and Ernst-Wilhelm Händler, Christoph Ransmayr and Raoul Schrott, Michael Köhlmeier and Marcel Beyer take up in their texts the manifold twentieth-century discourses on reason and carry them forward with the means of fictional narration. Unlike in philosophy, they regard insight into the limits of reason not as the result of yet another rational reflection, but as an aesthetic effect of the literary work of art. This claim, which can only be realized by degrees, connects contemporary German-language novels with manifold traditions reaching back to the late eighteenth century.