Abstraction, mimesis and the evolution of deep learning
Department of Computer Science, University of Copenhagen (DNK); GKN Aerospace Engines, Trollhättan (SWE).
Department of Computer Science, University of Copenhagen (DNK).
Center Leo Apostel (CLEA), Vrije Universiteit, Brussels (BEL).
GKN Aerospace Engines, Trollhättan (SWE).
2023 (English). In: AI & Society: The Journal of Human-Centred Systems and Machine Intelligence, ISSN 0951-5666, E-ISSN 1435-5655, p. 1-9. Article in journal (Refereed). Published.
Abstract [en]

Deep learning developers typically rely on deep learning software frameworks (DLSFs)—simply described as pre-packaged libraries of programming components that provide high-level access to deep learning functionality. New DLSFs progressively encapsulate mathematical, statistical and computational complexity. Such higher levels of abstraction subsequently make it easier for deep learning methodology to spread through mimesis (i.e., imitation of models perceived as successful). In this study, we quantify this increase in abstraction and discuss its implications. Analyzing publicly available code from GitHub, we found that the introduction of DLSFs correlates both with significant increases in the number of deep learning projects and substantial reductions in the number of lines of code used. We subsequently discuss and argue for the importance of abstraction in deep learning with respect to ephemeralization, technological advancement, democratization, adopting timely levels of abstraction, the emergence of mimetic deadlocks, issues related to the use of black box methods including privacy and fairness, and the concentration of technological power. Finally, we also discuss abstraction as a symptom of an ongoing technological metatransition.
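The abstract describes DLSFs as pre-packaged libraries that hide mathematical and computational detail behind high-level interfaces, reducing the lines of code a developer writes. As a minimal, purely illustrative sketch (not taken from the paper; the `Dense` and `Sequential` names are hypothetical, echoing common framework conventions), the snippet below writes the same forward pass at two levels of abstraction:

```python
# Hypothetical sketch of the abstraction a DLSF provides: the same two-unit
# dense layer, once with the math spelled out, once behind a layer-style API.

def relu(x):
    # Elementwise rectified linear unit.
    return [max(0.0, v) for v in x]

def dense_forward(x, weights, biases):
    # Low-level path: y_j = sum_i x_i * W[i][j] + b_j, written explicitly.
    return [sum(x[i] * weights[i][j] for i in range(len(x))) + biases[j]
            for j in range(len(biases))]

class Dense:
    # High-level path: a layer object that hides the arithmetic above.
    def __init__(self, weights, biases, activation=None):
        self.weights, self.biases, self.activation = weights, biases, activation

    def __call__(self, x):
        y = dense_forward(x, self.weights, self.biases)
        return self.activation(y) if self.activation else y

class Sequential:
    # Chains layers, so a whole model becomes one line of user code.
    def __init__(self, *layers):
        self.layers = layers

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

# With the abstraction in place, building and running the model is terse:
W = [[0.5, -0.2], [0.3, 0.8]]   # 2 inputs -> 2 units
b = [0.1, 0.0]
model = Sequential(Dense(W, b, activation=relu))
print(model([1.0, 2.0]))  # approximately [1.2, 1.4]
```

The user-facing code shrinks to a couple of lines while the underlying computation is unchanged, which is the kind of lines-of-code reduction the study measures across GitHub projects.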

Place, publisher, year, edition, pages
Springer Nature, 2023. p. 1-9
Keywords [en]
Deep learning · Evolution of deep learning · Abstraction · Mimesis
National Category
Learning; Information Systems, Social aspects
Research subject
Work Integrated Learning
Identifiers
URN: urn:nbn:se:hv:diva-20135
DOI: 10.1007/s00146-023-01688-z
ISI: 000999202200001
Scopus ID: 2-s2.0-85160720644
OAI: oai:DiVA.org:hv-20135
DiVA, id: diva2:1776327
Note

CC BY 4.0

Open access funding provided by Royal Danish Library.

Available from: 2023-06-28 Created: 2023-06-28 Last updated: 2024-01-02

Open Access in DiVA

fulltext (820 kB), 71 downloads
File information
File name: FULLTEXT01.pdf
File size: 820 kB
Checksum (SHA-512): 89034dbe8c3b91bf345a8225c42b61750c8dc1c54d8a31cfa741f76e69e4c56347e7fce524ef9df5737930a1d120e89c0d6870d2f4d3e6a1a29e148967de59c9
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text
Scopus

Authority records

Lundh Snis, Ulrika
