Abstraction, mimesis and the evolution of deep learning
Department of Computer Science, University of Copenhagen (DNK); GKN Aerospace Engines, Trollhättan (SWE).
Department of Computer Science, University of Copenhagen (DNK).
Center Leo Apostel (CLEA), Vrije Universiteit, Brussels (BEL).
GKN Aerospace Engines, Trollhättan (SWE).
2023 (English). In: AI & Society: The Journal of Human-Centred Systems and Machine Intelligence, ISSN 0951-5666, E-ISSN 1435-5655, p. 1-9. Article in journal (Refereed). Published.
Abstract [en]

Deep learning developers typically rely on deep learning software frameworks (DLSFs)—simply described as pre-packaged libraries of programming components that provide high-level access to deep learning functionality. New DLSFs progressively encapsulate mathematical, statistical and computational complexity. Such higher levels of abstraction subsequently make it easier for deep learning methodology to spread through mimesis (i.e., imitation of models perceived as successful). In this study, we quantify this increase in abstraction and discuss its implications. Analyzing publicly available code from Github, we found that the introduction of DLSFs correlates both with significant increases in the number of deep learning projects and substantial reductions in the number of lines of code used. We subsequently discuss and argue the importance of abstraction in deep learning with respect to ephemeralization, technological advancement, democratization, adopting timely levels of abstraction, the emergence of mimetic deadlocks, issues related to the use of black box methods including privacy and fairness, and the concentration of technological power. Finally, we also discuss abstraction as a symptom of an ongoing technological metatransition.
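The abstraction the abstract describes can be illustrated with a minimal sketch in plain Python (no real DLSF is used; all names and values here are illustrative): the forward pass of a single dense layer written "from scratch", next to the one-line call a framework would typically expose for the same operation.

```python
import math

# "Low-level" version: a dense layer with sigmoid activation, written
# without any framework. The developer handles the linear algebra and
# the activation function explicitly.
def dense_forward(x, weights, bias):
    # weights: one row of input weights per output unit; x: input vector
    out = []
    for row, b in zip(weights, bias):
        z = sum(w * xi for w, xi in zip(row, x)) + b
        out.append(1.0 / (1.0 + math.exp(-z)))  # sigmoid activation
    return out

# A DLSF encapsulates all of the above behind a high-level call,
# roughly (pseudocode, not any specific framework's API):
#   layer = Dense(units=2, activation="sigmoid")
#   out = layer(x)

x = [1.0, -2.0]
weights = [[0.5, 0.25], [-0.5, 0.75]]
bias = [0.0, 0.1]
print(dense_forward(x, weights, bias))
```

The gap between the explicit loop and the single framework call is, in miniature, the reduction in lines of code the study measures across GitHub projects.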

Place, publisher, year, edition, pages
Springer Nature, 2023. p. 1-9
Keywords [en]
Deep learning · Evolution of deep learning · Abstraction · Mimesis
National subject category
Learning; Information Systems, Social Aspects
Research subject
Work Integrated Learning
Identifiers
URN: urn:nbn:se:hv:diva-20135
DOI: 10.1007/s00146-023-01688-z
ISI: 000999202200001
Scopus ID: 2-s2.0-85160720644
OAI: oai:DiVA.org:hv-20135
DiVA, id: diva2:1776327
Note

CC BY 4.0

Open access funding provided by Royal Danish Library.

Available from: 2023-06-28 Created: 2023-06-28 Last updated: 2024-01-02

Open Access in DiVA

fulltext (820 kB), 83 downloads
File information
File name: FULLTEXT01.pdf
File size: 820 kB
Checksum (SHA-512): 89034dbe8c3b91bf345a8225c42b61750c8dc1c54d8a31cfa741f76e69e4c56347e7fce524ef9df5737930a1d120e89c0d6870d2f4d3e6a1a29e148967de59c9
Type: fulltext
MIME type: application/pdf

Other links
Publisher's full text
Scopus

Person

Lundh Snis, Ulrika

Total: 83 downloads
The number of downloads is the sum of downloads of all full texts. It may include, for example, earlier versions that are no longer available.
